disallow rule

by mrahier96

WebAssembly: how to understand, debug and mess with it

Introduction

The goal of this post is to provide the basics of WebAssembly (abbreviated “Wasm”), so that the next time you encounter it, you will be able to understand it and test it.

I will not cover all the available instructions, as documentation already exists for that (see below), but I will try to give you the keys to understand any Wasm code.
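
To give you a first taste, here is what a minimal module looks like in the Wasm text format; the $add function below is an illustrative example of mine, not taken from any particular binary:

    (module
      ;; $add takes two 32-bit integers and returns their sum
      (func $add (param $a i32) (param $b i32) (result i32)
        local.get $a    ;; push the first parameter onto the stack
        local.get $b    ;; push the second parameter onto the stack
        i32.add)        ;; pop both values, push their sum
      ;; make the function callable from the host, e.g. from JavaScript
      (export "add" (func $add)))

Wasm is a stack machine: most instructions pop their operands from an implicit stack and push their result back, and that is the main reading habit to build.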

Documentation

The documentation for all the instructions in Wasm is available here: https://github.com/sunfishcode/wasm-reference-manual/blob/master/WebAssembly.md

Moreover, a good picture of how to write Wasm code can be found here:
https://learnxinyminutes.com/docs/wasm/

If there is documentation, why this post?

It’s simple:

  • Not everybody knows the basics of reading assembly code, and the tools for handling Wasm are few.
  • The mentioned documentation (which is the official one) is, in my opinion, not the easiest one to understand.

So the goal is to address these two points.

by mrahier96

What is Format Preserving Encryption (FPE)?

Format Preserving Encryption, referred to as FPE from here on, is a particular form of encryption with the constraint of preserving the initial format. In other words, the output keeps the same format as the input. The format of data can be defined by a charset (called the domain in the article below) and a length. Here are some examples:

  • A 16-digit card number is encrypted into another 16-digit number.
  • A 12-hexadecimal-digit MAC address is encrypted into another 12-hex-digit value.
  • An email address is encrypted into another email address.
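
To make the idea concrete, here is a minimal toy sketch in Python of the principle: an even-length decimal string is encrypted into another decimal string of the same length by a balanced Feistel network whose arithmetic is done modulo a power of 10. This is my own illustration, not a real FPE scheme such as NIST's FF1/FF3, and it must not be used to protect real data:

    import hashlib

    def _prf(key: bytes, rnd: int, data: str, width: int) -> int:
        # Toy round function: a keyed hash reduced modulo 10**width.
        digest = hashlib.sha256(key + bytes([rnd]) + data.encode()).digest()
        return int.from_bytes(digest, "big") % (10 ** width)

    def toy_fpe_encrypt(key: bytes, digits: str, rounds: int = 8) -> str:
        # Balanced Feistel over an even-length decimal string: modular
        # addition keeps every intermediate value decimal, so the output
        # has exactly the same format as the input.
        assert digits.isdigit() and len(digits) % 2 == 0
        w = len(digits) // 2
        left, right = digits[:w], digits[w:]
        for r in range(rounds):
            f = _prf(key, r, right, w)
            left, right = right, f"{(int(left) + f) % 10 ** w:0{w}d}"
        return left + right

    def toy_fpe_decrypt(key: bytes, digits: str, rounds: int = 8) -> str:
        # Run the rounds backwards, subtracting where encryption added.
        w = len(digits) // 2
        left, right = digits[:w], digits[w:]
        for r in reversed(range(rounds)):
            f = _prf(key, r, left, w)
            left, right = f"{(int(right) - f) % 10 ** w:0{w}d}", left
        return left + right

    ciphertext = toy_fpe_encrypt(b"secret-key", "4111111111111111")
    assert ciphertext.isdigit() and len(ciphertext) == 16
    assert toy_fpe_decrypt(b"secret-key", ciphertext) == "4111111111111111"

The same construction generalises to any charset by working modulo the size of the domain instead of modulo a power of 10.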


by Excellium SA

Robots.txt & cybersecurity: Protecting your web applications from hackers

What is a Robots.txt file?

A robots.txt file is a simple text file that should be available at the root level of the application, like the one on the Excellium website. This file tells search engine robots which parts of the website they are allowed to crawl and which parts they should avoid.
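
As an illustration, a robots.txt file is just a list of plain-text directives; the paths below are made up for the example:

    # Applies to every crawler
    User-agent: *
    # Do not crawl these parts of the site
    Disallow: /admin/
    Disallow: /internal/
    # Everything else may be crawled
    Allow: /
    # Point crawlers to the sitemap
    Sitemap: https://www.example.com/sitemap.xml

Keep in mind that these directives are purely advisory: a well-behaved crawler honours them, but a malicious one is free to ignore them, and a Disallow line can even point an attacker straight to the paths you wanted to hide.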

In the example above, the robots.txt file provides the website’s sitemap to help search engines browse all links more easily than crawling each page one by one and discovering links recursively. This also allows the indexing of pages that have no external references pointing to them.
