
Decompiler

A decompiler is a computer program that translates an executable file into high-level source code. It therefore does the opposite of a typical compiler, which translates a high-level language into a low-level one. While disassemblers translate an executable into assembly language, decompilers go a step further and translate the code into a higher-level language such as C or Java, which requires more sophisticated techniques. Decompilers are usually unable to perfectly reconstruct the original source code, and frequently produce code that is difficult to read. Nonetheless, they remain an important tool in the reverse engineering of computer software.

Introduction

The term decompiler is most commonly applied to a program which translates executable programs (the output from a compiler) into source code in a (relatively) high level language which, when compiled, will produce an executable whose behavior is the same as the original executable program. By comparison, a disassembler translates an executable program into assembly language (and an assembler could be used for assembling it back into an executable program).

Decompilation is the act of using a decompiler, although the term can also refer to the output of a decompiler. It can be used for the recovery of lost source code, and is also useful in some cases for computer security, interoperability and error correction.[1] The success of decompilation depends on the amount of information present in the code being decompiled and on the sophistication of the analysis performed on it. The bytecode formats used by many virtual machines (such as the Java Virtual Machine or the .NET Framework Common Language Runtime) often include extensive metadata and high-level features that make decompilation quite feasible. The presence of debug data, i.e. debug symbols, may make it possible to reproduce the original names of variables and structures and even the line numbers. Machine language without such metadata or debug data is much harder to decompile.[2]
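To illustrate how much metadata survives in bytecode formats, the sketch below inspects a CPython code object (CPython is chosen here only because its bytecode is easy to examine from the standard library; the JVM and CLR formats mentioned above carry comparable metadata). The function is a made-up example:

```python
import dis

# A throwaway function, compiled like any other Python code.
def balance_after_interest(balance, rate):
    interest = balance * rate
    return balance + interest

code = balance_after_interest.__code__
# The compiled code object still carries argument/local names and source line
# numbers, which is why bytecode decompiles far more cleanly than stripped
# machine code.
print(code.co_varnames)    # ('balance', 'rate', 'interest')
print(code.co_firstlineno)
dis.dis(balance_after_interest)
```

A machine-code binary compiled without debug symbols retains none of this: the decompiler must invent names and infer types instead.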

Some compilers and post-compilation tools produce obfuscated code (that is, they attempt to produce output that is very difficult to decompile, or that decompiles to confusing output). This is done to make it more difficult to reverse engineer the executable.

While decompilers are normally used to (re-)create source code from binary executables, there are also decompilers to turn specific binary data files into human-readable and editable sources.[3][4]

The success level achieved by decompilers can be impacted by various factors. These include the abstraction level of the source language: if the object code contains explicit class structure information, it aids the decompilation process. Descriptive information, especially naming details, also accelerates the decompiler's work. Moreover, less optimized code is quicker to decompile, since optimization can cause greater deviation from the original code.[5]

Design

Decompilers can be thought of as composed of a series of phases, each of which contributes specific aspects of the overall decompilation process.

Loader

The first decompilation phase loads and parses the input machine code or intermediate language program's binary file format. It should be able to discover basic facts about the input program, such as the architecture (Pentium, PowerPC, etc.) and the entry point. In many cases, it should be able to find the equivalent of the main function of a C program, which is the start of the user-written code. This excludes the runtime initialization code, which should not be decompiled if possible. If available, the symbol tables and debug data are also loaded. The front end may be able to identify the libraries used even if they are linked with the code; this will provide library interfaces. If it can determine the compiler or compilers used, it may provide useful information in identifying code idioms.[6]
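As a minimal sketch of this phase, the Python fragment below reads the entry-point field from a 64-bit little-endian ELF header. The header bytes are synthetic, and a real loader would also handle 32-bit ELF, other endiannesses, PE, Mach-O, section tables, symbols, and more:

```python
import struct

def elf_entry_point(header: bytes) -> int:
    """Read the entry-point address from a 64-bit little-endian ELF header."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # e_entry is an 8-byte field at offset 0x18 in the 64-bit ELF header.
    (entry,) = struct.unpack_from("<Q", header, 0x18)
    return entry

# A synthetic 64-byte header with entry point 0x401000, for illustration only.
fake = bytearray(64)
fake[:4] = b"\x7fELF"
struct.pack_into("<Q", fake, 0x18, 0x401000)
print(hex(elf_entry_point(bytes(fake))))  # 0x401000
```

The entry point found here is the runtime's start address; locating the user's main, as described above, requires further heuristics on top of this.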

Disassembly

The next logical phase is the disassembly of machine code instructions into a machine independent intermediate representation (IR). For example, the Pentium machine instruction

mov eax, [ebx+0x04] 

might be translated to the IR

eax := m[ebx+4]; 

Idioms

Idiomatic machine code sequences are sequences of code whose combined semantics are not immediately apparent from the instructions' individual semantics. Either as part of the disassembly phase, or as part of later analyses, these idiomatic sequences need to be translated into known equivalent IR. For example, the x86 assembly code:

cdq eax             ; edx is set to the sign-extension of eax
xor eax, edx
sub eax, edx

could be translated to

eax := abs(eax);

Some idiomatic sequences are machine independent; some involve only one instruction. For example, xor eax, eax clears the eax register (sets it to zero). This can be implemented with a machine independent simplification rule, such as a = 0.
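The abs() idiom above can be checked by simulating the three instructions on 32-bit values, as in this Python sketch (register behavior is modeled with masking, assuming two's-complement arithmetic):

```python
def abs_idiom(eax: int) -> int:
    """Simulate the cdq/xor/sub sequence on a 32-bit register value."""
    mask = 0xFFFFFFFF
    # cdq: edx becomes 0 if eax is non-negative, 0xFFFFFFFF if negative.
    edx = mask if eax & 0x80000000 else 0
    eax = (eax ^ edx) & mask   # xor eax, edx
    eax = (eax - edx) & mask   # sub eax, edx
    return eax

# Check the sequence against Python's abs() on a few 32-bit values.
for v in (5, -5, 0, -1):
    assert abs_idiom(v & 0xFFFFFFFF) == abs(v) & 0xFFFFFFFF
print("idiom computes abs() for all test values")
```

None of the three instructions individually suggests an absolute value; only the combined semantics do, which is exactly what makes such sequences hard for a decompiler to recognize.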

In general, it is best to delay detection of idiomatic sequences if possible, to later stages that are less affected by instruction ordering. For example, the instruction scheduling phase of a compiler may insert other instructions into an idiomatic sequence, or change the ordering of instructions in the sequence. A pattern matching process in the disassembly phase would probably not recognize the altered pattern. Later phases group instruction expressions into more complex expressions, and modify them into a canonical (standardized) form, making it more likely that even the altered idiom will match a higher level pattern later in the decompilation.

It is particularly important to recognize the compiler idioms for subroutine calls, exception handling, and switch statements. Some languages also have extensive support for strings or long integers.

Program analysis

Various program analyses can be applied to the IR. In particular, expression propagation combines the semantics of several instructions into more complex expressions. For example,

mov eax, [ebx+0x04]
add eax, [ebx+0x08]
sub [ebx+0x0C], eax

could result in the following IR after expression propagation:

m[ebx+12] := m[ebx+12] - (m[ebx+4] + m[ebx+8]);

The resulting expression is more like high level language, and has also eliminated the use of the machine register eax. Later analyses may eliminate the ebx register.
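Expression propagation can be sketched as forward substitution of register definitions into later uses. The toy below operates on (destination, expression) string pairs in the article's IR notation; a real decompiler works on a structured IR with liveness information, where plain string replacement would of course be unsafe:

```python
def propagate(ir):
    """Forward-substitute register definitions into later expressions.
    ir is a list of (destination, expression) string pairs."""
    env, out = {}, []
    for dest, expr in ir:
        # Replace register names with their known defining expressions.
        for reg, val in env.items():
            expr = expr.replace(reg, val)
        if dest.startswith("m["):
            out.append(f"{dest} := {expr};")  # memory writes survive as statements
        else:
            env[dest] = f"({expr})"           # register defs become substitutions
    return out

ir = [("eax", "m[ebx+4]"),
      ("eax", "eax + m[ebx+8]"),
      ("m[ebx+12]", "m[ebx+12] - eax")]
print(propagate(ir))
```

Run on the three-instruction sequence above, the eax definitions fold into the final memory write, leaving a single compound statement and no reference to eax at all.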

Data flow analysis

The places where register contents are defined and used must be traced using data flow analysis. The same analysis can be applied to locations that are used for temporaries and local data. A different name can then be formed for each such connected set of value definitions and uses. It is possible that the same local variable location was used for more than one variable in different parts of the original program. Even worse, it is possible for the data flow analysis to identify a path along which a value could flow between two such uses, even though it would never actually happen or matter in reality. This may in bad cases lead to the need to define a location as a union of types. The decompiler may allow the user to explicitly break such unnatural dependencies, which will lead to clearer code. This of course means a variable is potentially used without being initialized, and so indicates a problem in the original program.[citation needed]
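The renaming step described above can be sketched as follows: each assignment to a location starts a fresh variable version, and uses refer to the latest one (an SSA-style renaming in miniature). The statement format and the names local_8, arg_0 etc. are invented for illustration:

```python
def rename_defs(stmts):
    """Give each definition of a location its own variable name.
    stmts is a list of (destination, [operand names]) pairs in program order."""
    counter, current, out = {}, {}, []
    for dest, operands in stmts:
        uses = [current.get(op, op) for op in operands]  # uses see the latest version
        counter[dest] = counter.get(dest, 0) + 1         # each def gets a fresh version
        current[dest] = f"{dest}_{counter[dest]}"
        out.append((current[dest], uses))
    return out

# 'local_8' holds two unrelated values in the original; renaming separates
# them into local_8_1 and local_8_2, which can then get distinct types.
stmts = [("local_8", ["arg_0"]),
         ("eax", ["local_8"]),
         ("local_8", ["arg_4"]),
         ("ebx", ["local_8"])]
print(rename_defs(stmts))
```

A real implementation must merge versions where control-flow paths join, which is where the spurious def-use connections mentioned above can arise.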

Type analysis

A good machine code decompiler will perform type analysis. Here, the way registers or memory locations are used results in constraints on the possible type of the location. For example, an and instruction implies that the operand is an integer; programs do not use such an operation on floating-point values (except in special library code) or on pointers. An add instruction results in three constraints, since the operands may be both integer, or one integer and one pointer (with integer and pointer results respectively; the third constraint comes from the ordering of the two operands when the types are different).[7]

Various high level expressions can be recognized which trigger recognition of structures or arrays. However, it is difficult to distinguish many of the possibilities, because of the freedom that machine code or even some high level languages such as C allow with casts and pointer arithmetic.

The example from the previous section could result in the following high level code:

struct T1 *ebx;
struct T1 {
    int v0004;
    int v0008;
    int v000C;
};
ebx->v000C -= ebx->v0004 + ebx->v0008;
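The constraint rule for add described above can be sketched as a small typing function; the string type names and the rejection of pointer-plus-pointer are simplifying assumptions of this sketch:

```python
def add_result_type(t1: str, t2: str) -> str:
    """Result type of 'add' under the constraints above: int+int -> int,
    int+ptr or ptr+int -> ptr (the operand order distinguishes the two mixed
    cases); ptr+ptr is rejected as ill-typed."""
    if t1 == "int" and t2 == "int":
        return "int"
    if {t1, t2} == {"int", "ptr"}:
        return "ptr"
    raise TypeError(f"add not defined for {t1} + {t2}")

print(add_result_type("int", "int"))  # int
print(add_result_type("ptr", "int"))  # ptr
```

A full type analysis collects such constraints from every instruction and solves them together, so that, for example, the base register of the memory accesses above is forced to be a structure pointer.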

Structuring

The penultimate decompilation phase involves structuring of the IR into higher level constructs such as while loops and if/then/else conditional statements. For example, the machine code

    xor eax, eax
l0002:
    or ebx, ebx
    jge l0003
    add eax, [ebx]
    mov ebx, [ebx+0x4]
    jmp l0002
l0003:
    mov [0x10040000], eax

could be translated into:

eax = 0;
while (ebx < 0) {
    eax += ebx->v0000;
    ebx = ebx->v0004;
}
v10040000 = eax;

Unstructured code is more difficult to translate into structured code than already structured code. Solutions include replicating some code, or adding boolean variables.[8]
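The structural cue the structuring phase looks for in the example above, a jump back to an earlier block, can be found with a depth-first search for back edges over the control-flow graph. The block names here mirror the labels in the machine code above:

```python
def find_back_edges(cfg, entry):
    """Find back edges by DFS over a CFG given as {block: [successor blocks]}.
    An edge to a block still on the DFS stack closes a loop: the cue that
    structuring turns into a 'while'."""
    back, on_stack, seen = [], set(), set()

    def dfs(node):
        seen.add(node)
        on_stack.add(node)
        for succ in cfg.get(node, []):
            if succ in on_stack:
                back.append((node, succ))   # loop found: succ is the loop head
            elif succ not in seen:
                dfs(succ)
        on_stack.discard(node)

    dfs(entry)
    return back

# CFG of the machine code above: entry -> l0002; l0002 -> body or l0003; body -> l0002.
cfg = {"entry": ["l0002"], "l0002": ["body", "l0003"], "body": ["l0002"]}
print(find_back_edges(cfg, "entry"))  # [('body', 'l0002')]
```

Once the loop head l0002 is identified, its exit condition (the jge) becomes the while test; irreducible graphs, where no such clean head exists, are what force the code replication or boolean flags mentioned below.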

Code generation

The final phase is the generation of the high level code in the back end of the decompiler. Just as a compiler may have several back ends for generating machine code for different architectures, a decompiler may have several back ends for generating high level code in different high level languages.

Just before code generation, it may be desirable to allow interactive editing of the IR, perhaps using some form of graphical user interface. This would allow the user to enter comments, and non-generic variable and function names. However, these are almost as easily entered in a post-decompilation edit. The user may want to change structural aspects, such as converting a while loop to a for loop. These are less readily modified with a simple text editor, although source code refactoring tools may assist with this process. The user may need to enter information that failed to be identified during the type analysis phase, e.g. modifying a memory expression to an array or structure expression. Finally, incorrect IR may need to be corrected, or changes made to cause the output code to be more readable.

Other techniques

Decompilers using neural networks have been developed. Such a decompiler may be trained by machine learning to improve its accuracy over time.[9]

Legality

The majority of computer programs are covered by copyright laws. Although the precise scope of what is covered by copyright differs from region to region, copyright law generally provides the author (the programmer(s) or employer) with a collection of exclusive rights to the program.[10] These rights include the right to make copies, including copies made into the computer’s RAM (unless creating such a copy is essential for using the program).[11] Since the decompilation process involves making multiple such copies, it is generally prohibited without the authorization of the copyright holder. However, because decompilation is often a necessary step in achieving software interoperability, copyright laws in both the United States and Europe permit decompilation to a limited extent.

In the United States, the copyright fair use defence has been successfully invoked in decompilation cases. For example, in Sega v. Accolade, the court held that Accolade could lawfully engage in decompilation in order to circumvent the software locking mechanism used by Sega's game consoles.[12] Additionally, the Digital Millennium Copyright Act (PUBLIC LAW 105–304[13]) contains exemptions for both security testing and evaluation in §1201(i), and reverse engineering in §1201(f).[14]

In Europe, the 1991 Software Directive explicitly provides for a right to decompile in order to achieve interoperability. The result of a heated debate between, on the one side, software protectionists, and, on the other, academics as well as independent software developers, Article 6 permits decompilation only if a number of conditions are met:

  • First, a person or entity must have a licence to use the program to be decompiled.
  • Second, decompilation must be necessary to achieve interoperability with the target program or other programs; interoperability information must therefore not be readily available through other means, such as manuals or API documentation. This is an important limitation, and the necessity must be proven by the party performing the decompilation. Its purpose is primarily to provide an incentive for developers to document and disclose their products' interoperability information.[15]
  • Third, the decompilation process must, if possible, be confined to the parts of the target program relevant to interoperability. Since one of the purposes of decompilation is to gain an understanding of the program structure, this third limitation may be difficult to meet. Again, the burden of proof is on the party performing the decompilation.

In addition, Article 6 prescribes that the information obtained through decompilation may not be used for other purposes and that it may not be given to others.

Overall, the decompilation right provided by Article 6 codifies what is claimed to be common practice in the software industry. Few European lawsuits are known to have emerged from the decompilation right. This could be interpreted as meaning one of three things:

  1. the decompilation right is not used frequently and may therefore have been unnecessary,
  2. the decompilation right functions well and provides sufficient legal certainty not to give rise to legal disputes, or
  3. illegal decompilation goes largely undetected.

In a 2000 report on the implementation of the Software Directive by the European member states, the European Commission seemed to support the second interpretation.[16]

See also

  • Disassembler
  • Binary recompiler
  • Linker (computing)
  • Abstract interpretation
  • Resource editor

Java decompilers

  • Mocha
  • JD Decompiler
  • JAD

Other decompilers

  • .NET Reflector
  • JEB Decompiler (Android Dalvik, Intel x86, ARM, MIPS, WebAssembly, Ethereum)
  • Ghidra
  • IDA Pro (includes a decompiler as an optional paid feature)
  • Binary Ninja
  • uncompyle6 (Python bytecode from 1.0 to 3.8)

References

  1. ^ Van Emmerik, Mike (2005-04-29). "Why Decompilation". Program-transformation.org. Archived from the original on 2010-09-22. Retrieved 2010-09-15.
  2. ^ Miecznikowski, Jerome; Hendren, Laurie (2002). "Decompiling Java Bytecode: Problems, Traps and Pitfalls". In Horspool, R. Nigel (ed.). Compiler Construction: 11th International Conference, proceedings / CC 2002. Springer-Verlag. pp. 111–127. ISBN 3-540-43369-4.
  3. ^ Paul, Matthias R. (2001-06-10) [1995]. "Format description of DOS, OS/2, and Windows NT .CPI, and Linux .CP files" (CPI.LST file) (1.30 ed.). Archived from the original on 2016-04-20. Retrieved 2016-08-20.
  4. ^ Paul, Matthias R. (2002-05-13). "[fd-dev] mkeyb". freedos-dev. Archived from the original on 2018-09-10. Retrieved 2018-09-10. […] .CPI & .CP codepage file analyzer, validator and decompiler […] Overview on /Style parameters: […] ASM source include files […] Standalone ASM source files […] Modular ASM source files […]
  5. ^ Elo, Tommi; Hasu, Tero (2003). "Detecting Co-Derivative Source Code – An Overview" (PDF). Teknisjuridinen selvitys tekijänoikeudesta tietokoneohjelman lähdekoodiin Suomessa ja Euroopassa.
  6. ^ Cifuentes, Cristina; Gough, K. John (July 1995). "Decompilation of Binary Programs". Software: Practice and Experience. 25 (7): 811–829. CiteSeerX 10.1.1.14.8073. doi:10.1002/spe.4380250706. S2CID 8229401.
  7. ^ Mycroft, Alan (1999). "Type-Based Decompilation". In Swierstra, S. Doaitse (ed.). Programming languages and systems: 8th European Symposium on Programming Languages and Systems. Springer-Verlag. pp. 208–223. ISBN 3-540-65699-5.
  8. ^ Cifuentes, Cristina (1994). "Chapter 6". Reverse Compilation Techniques (PDF) (PhD thesis). Queensland University of Technology. Archived (PDF) from the original on 2016-11-22. Retrieved 2019-12-21.
  9. ^ Tian, Yuandong; Fu, Cheng (2021-01-27). "Introducing N-Bref: a neural-based decompiler framework". Retrieved 2022-12-30.
  10. ^ Rowland, Diane (2005). Information technology law (3 ed.). Cavendish. ISBN 1-85941-756-6.
  11. ^ "U.S. Copyright Office - Copyright Law: Chapter 1". Archived from the original on 2017-12-25. Retrieved 2014-04-10.
  12. ^ "The Legality of Decompilation". Program-transformation.org. 2004-12-03. Archived from the original on 2010-09-22. Retrieved 2010-09-15.
  13. ^ "Digital Millennium Copyright Act" (PDF). US Congress. 1998-10-28. Archived (PDF) from the original on 2013-12-10. Retrieved 2013-11-15.
  14. ^ "Federal Register :: Request Access". Archived from the original on 2022-01-25. Retrieved 2021-01-31.
  15. ^ Czarnota, Bridget; Hart, Robert J. (1991). Legal protection of computer programs in Europe: a guide to the EC directive. London: Butterworths Tolley. ISBN 0-40600542-7.
  16. ^ "Report from the Commission to the Council, the European Parliament and the Economic and Social Committee on the implementation and effects of Directive 91/250/EEC on the legal protection of computer programs". Archived from the original on 2020-12-04. Retrieved 2020-12-26.

External links

  • Decompilers and Disassemblers at Curlie
