Improving the user experience with .NET NativeAOT



TypedocConverter is a project I started a while ago while helping to maintain the Monaco Editor UWP wrapper: the Monaco Editor has far too many APIs for hand-writing C# type bindings to be cost-effective.

This tool generates the corresponding C# type-binding code directly from the JSON file that typedoc produces from TypeScript sources, with complete JSON serialization support. It therefore greatly reduces the difficulty of porting a TypeScript library to .NET. (As for why it parses typedoc output instead of TypeScript itself: I was simply too lazy to write a TypeScript parser.)

TypedocConverter is written in F#. Although .NET 5 can already produce a self-contained, single-file release after assembly trimming, I had long thought it would be nice to compile the whole program into a native binary with AOT technology, so that users run machine code directly, with no .NET runtime or JIT required.

Beyond functionality, the most important aspect of a tool is user experience. Native compilation greatly improves startup speed (the tool was already fast enough, but I wanted to shorten the roughly 100 ms startup to under 1 ms), so that users never have to wait when invoking it.

Researching AOT solutions

.NET has long had a project named CoreRT that could compile .NET assemblies to native binaries, but it had seen no active maintenance since 2018. Thanks to strong community demand, plus a Microsoft partner whose project required AOT technology (saying they would drop .NET without it), the project was revived and moved, under the name NativeAOT, to the runtimelab repository, becoming a P0 (highest-priority) experimental work item for .NET 6. That means an officially supported preview rather than the perpetual alpha it had been. win-x64, linux-x64, and osx-x64 are supported today; arm64, mobile platforms, and WebAssembly are planned.

Taking this opportunity, I decided to use this solution to compile the project as a native image.

How NativeAOT works

The idea behind .NET NativeAOT is very simple:

  • Start from an AOT-friendly implementation of the core library (System.Private.CoreLib) that provides type lookup, reflection analysis, and related services
  • Scan the assembly and record the types and methods used
  • Invoke RyuJIT to generate type metadata and code for every method, producing obj binary files
  • Invoke the linker (MSVC or Clang) to link the generated obj files with the GC and system libraries into the final executable

At this stage, NativeAOT is basically complete; the remaining work is patching and polish, plus catching up with newer .NET versions (currently, runtime-dependent features introduced after C# 8, such as default interface method implementations and module initializers, are not yet supported).

What is the difference between this and .NET Native? .NET Native uses the UTC compiler (the MSVC back end) for code generation, while NativeAOT uses RyuJIT.

For the complete .NET NativeAOT documentation, see: using-native-aot

Converting the project to NativeAOT

NativeAOT is very easy to use: you only need to modify the csproj project file.
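The following is a minimal sketch of the csproj additions discussed below, reconstructed from the NativeAOT documentation; adjust values to your own project:

```xml
<!-- NativeAOT settings (sketch) -->
<PropertyGroup>
  <!-- Optimize generated code for speed; use "Size" for the smallest binary -->
  <IlcOptimizationPreference>Speed</IlcOptimizationPreference>
  <!-- Fold identical method bodies to reduce binary size -->
  <IlcFoldIdenticalMethodBodies>true</IlcFoldIdenticalMethodBodies>
</PropertyGroup>
<ItemGroup>
  <!-- The NativeAOT compiler itself; the wildcard picks up the latest build -->
  <PackageReference Include="Microsoft.DotNet.ILCompiler" Version="6.0.0-*" />
</ItemGroup>
```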


IlcOptimizationPreference: specify Speed to generate code for maximum performance, or Size to generate code for the smallest program.

IlcFoldIdenticalMethodBodies: merges identical method bodies, which helps reduce binary size.

Finally, reference the new Microsoft.DotNet.ILCompiler package, which is the NativeAOT compiler itself. The version is specified with the wildcard 6.0.0-*, so the latest compiler build is fetched on every compile.

Because the Microsoft.DotNet.ILCompiler artifacts come from the experimental repository and are not published to the official NuGet feed, you need to create a nuget.config that adds the experimental repository's artifact feed as a package source.
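A sketch of that nuget.config; the feed URL shown is the dotnet-experimental feed that the runtimelab instructions pointed to (verify against the current docs):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Experimental feed hosting the NativeAOT ILCompiler packages -->
    <add key="dotnet-experimental"
         value="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-experimental/nuget/v3/index.json" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```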

That's it; start compiling:

dotnet publish -c Release -r win-x64

With it come a lot of warnings:

AOT analysis warning IL9700: Microsoft.FSharp.Reflection.FSharpType.MakeFunctionType(Type,Type): Calling 'System.Type.MakeGenericType(Type[])' which has `RequiresDynamicCodeAttribute` can break functionality when compiled fully ahead of time. The native code for this instantiation might not be available at runtime.
AOT analysis warning IL9700: Microsoft.FSharp.Reflection.FSharpValue.MakeFunction(Type,FSharpFunc`2): Calling 'System.Type.MakeGenericType(Type[])' which has `RequiresDynamicCodeAttribute` can break functionality when compiled fully ahead of time. The native code for this instantiation might not be available at runtime.

The reason is simple: NativeAOT does not support runtime code generation, and MakeGenericType may need to create a type at run time, so it may not be supported.

Why only "may"? Because while NativeAOT cannot generate new types at run time, instantiations whose code was already generated at compile time are fully supported.
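To illustrate the distinction, a hypothetical sketch (the type names are invented for illustration):

```csharp
using System;
using System.Collections.Generic;

struct SomeStruct { }

class Program
{
    static void Main()
    {
        // Fine under NativeAOT: List<int> is used directly below, so the
        // compiler generated native code for this instantiation.
        var known = new List<int>();
        Type ok = typeof(List<>).MakeGenericType(typeof(int));

        // Risky under NativeAOT: if no code path ever instantiates
        // List<SomeStruct>, this can fail at run time because no native
        // code exists for that instantiation.
        Type risky = typeof(List<>).MakeGenericType(typeof(SomeStruct));
    }
}
```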

Since the project uses neither System.Reflection.Emit (no dynamic IL weaving at run time) nor Assembly.LoadFile, it is compatible with NativeAOT.

Compilation speed was acceptable: only about half a minute. It produced a 29 MB exe. The size is not great, but let's run it first:

> ./TypedocConverter
[Error] No input file
Typedoc Converter Arguments:
--inputfile [file]: input file
--namespace [namespace]: specify namespace for generated code
--splitfiles [true|false]: whether to split code to different files
--outputdir [path]: used for place code files when splitfiles is true
--outputfile [path]: used for place code file when splitfiles is false
--number-type [int/decimal/double...]: config for number type mapping
--promise-type [CLR/WinRT]: config for promise type mapping, CLR for Task and WinRT for IAsyncAction/IAsyncOperation
--any-type [object/dynamic...]: config for any type mapping
--array-type [Array/IEnumerable/List...]: config for array type mapping
--nrt-disabled [true|false]: whether to disable Nullable Reference Types
--use-system-json [true|false]: whether to use System.Text.Json instead of Newtonsoft.Json

It runs in an instant; the startup time is imperceptible (under 1 ms), which is a great experience.

But just as I was celebrating, I tested the functionality with a real JSON file and got an error:

Unhandled Exception: EETypeRva:0x013EC198(System.Reflection.MissingRuntimeArtifactException): MakeGenericMethod() cannot create this generic method instantiation because no code was generated for it: 'Microsoft.FSharp.Collections.ListModule.OfSeq(System.Collections.Generic.IEnumerable)'.
   at Internal.Reflection.Core.Execution.ExecutionEnvironment.GetMethodInvoker(RuntimeTypeInfo, QMethodDefinition, RuntimeTypeInfo[], MemberInfo) + 0x144
   at System.Reflection.Runtime.MethodInfos.NativeFormat.NativeFormatMethodCommon.GetUncachedMethodInvoker(RuntimeTypeInfo[], MemberInfo) + 0x50
   at System.Reflection.Runtime.MethodInfos.RuntimeMethodInfo.get_MethodInvoker() + 0xa1
   at System.Reflection.Runtime.MethodInfos.RuntimeNamedMethodInfo`1.MakeGenericMethod(Type[]) + 0x104

You can see that the method Microsoft.FSharp.Collections.ListModule.OfSeq(System.Collections.Generic.IEnumerable) is missing.

This is because the NativeAOT compiler could not reach this instantiation through code-path analysis, so no code was generated for it; the runtime then failed when trying to create the instantiation because no implementation could be found.

Therefore, you need to instruct the compiler to generate code for the specified types and methods through runtime directives: create an rd.xml file and include it in the project:
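The inclusion looks roughly like this in the csproj (RdXmlFile is the ILCompiler item name for runtime-directive files):

```xml
<ItemGroup>
  <!-- Hand the runtime-directives file to the NativeAOT compiler -->
  <RdXmlFile Include="rd.xml" />
</ItemGroup>
```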

Then write the types and methods that the compiler should additionally generate into rd.xml. After some trial and error, I ended up with the following:
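A sketch of the resulting rd.xml (per the rd.xml format; Dynamic="Required All" preserves the whole assembly for reflection, matching what the size discussion later refers to):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Directives xmlns="http://schemas.microsoft.com/netfx/2013/01/metadata">
  <Application>
    <!-- Keep the entire F# core library available for reflection -->
    <Assembly Name="FSharp.Core" Dynamic="Required All" />
  </Application>
</Directives>
```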

To explain the above: Name specifies the type; the two parts separated by the comma are the type's full name and the name of the assembly the type comes from. All basic .NET types come from System.Private.CoreLib or mscorlib. A detailed format description can be found in rd-xml-format.

In .NET, the compiler generates a separate implementation for each value-type generic argument, while all reference-type arguments share a single implementation. The reason is obvious: a reference is just a pointer. Thanks to this, reference-type arguments need not be spelled out individually; specifying the unified reference type System.Object covers them all. For value-type arguments, however, you must state exactly which instantiations to generate code for.
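For example, a sketch of two rd.xml entries (the backtick-arity and assembly-qualified-name syntax follows the rd.xml format; List is used as an illustrative stand-in):

```xml
<!-- Reference-type arguments: the System.Object instantiation covers them all -->
<Type Name="System.Collections.Generic.List`1[[System.Object,System.Private.CoreLib]]"
      Dynamic="Required All" />
<!-- Value-type arguments must be listed explicitly, one entry per instantiation -->
<Type Name="System.Collections.Generic.List`1[[System.Int32,System.Private.CoreLib]]"
      Dynamic="Required All" />
```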

After these twists and turns, recompile and run. This time everything works, startup is instant, and runtime performance is excellent. Moreover, the fully statically linked binary runs without installing any runtime; the experience is almost identical to a program written in C++.

Binary size optimization

After the series of operations above, startup and execution are fast, but the generated program is 30 MB, which is still a little large. Next, let's optimize the binary size without sacrificing runtime code performance.

First, set TrimMode to link, which makes NativeAOT adopt a more aggressive assembly-trimming scheme, removing unreferenced code from the code paths at method granularity. In addition, my program needs no internationalization support, so the unused multilingual support and its resource files can be removed as well.
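A sketch of the corresponding csproj properties (InvariantGlobalization is, as far as I know, the switch that drops the multilingual/ICU support; verify against the trimming docs):

```xml
<PropertyGroup>
  <!-- Aggressive, method-granularity trimming -->
  <TrimMode>link</TrimMode>
  <!-- Drop internationalization support and its satellite resources -->
  <InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
```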


After recompiling, the generated exe is only 27 MB. But running it throws:

Unhandled Exception: Newtonsoft.Json.JsonSerializationException: Unable to find a constructor to use for type Definitions+Reflection. A class should either have a default constructor, one constructor with arguments or a constructor marked with the JsonConstructor attribute. Path 'id', line 2, position 6.
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateNewObject(JsonReader, JsonObjectContract, JsonProperty, JsonProperty, String, Boolean&) + 0x1d1
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateObject(JsonReader, Type, JsonContract, JsonProperty, JsonContainerContract, JsonProperty, Object) + 0x2cc
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader, Type, JsonContract, JsonProperty, JsonContainerContract, JsonProperty, Object) + 0xa4
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader, Type, Boolean) + 0x26e
   at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader, Type) + 0xf8
   at Newtonsoft.Json.JsonConvert.DeserializeObject(String, Type, JsonSerializerSettings) + 0x93
   at Newtonsoft.Json.JsonConvert.DeserializeObject[T](String, JsonSerializerSettings) + 0x2b
   at [email protected](JsonSerializerSettings, Definitions.Config, Unit) + 0x31
   at TypedocConverter!+0x83a0ca

The error message shows that JSON deserialization failed: the Definitions+Reflection type was trimmed away. Since I know that all JSON deserialization target types in my program come from my own assembly, I don't need to bother with rd.xml here; I just need to tell the compiler not to trim the types in my own assembly (this does not cover generic instantiations, because generic implementations still need specialization).
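A sketch using TrimmerRootAssembly to root my own assembly (the assembly name is assumed to match the project name):

```xml
<ItemGroup>
  <!-- Keep every type in my own assembly; trimming still applies elsewhere -->
  <TrimmerRootAssembly Include="TypedocConverter" />
</ItemGroup>
```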

Recompile and run; this time there are no problems.

The final program size is 27 MB, not much smaller than 30 MB, but that is expected: in the rd.xml written earlier, out of laziness, I used Dynamic="Required All" to preserve every type in the F# core library. Removing Dynamic="Required All" yields a 22 MB binary, but it takes more effort to work out which types must be listed in rd.xml.

After zip compression, only 11 MB remains, which I consider a good size.

Note that on Windows the debug symbols are emitted as a separate .pdb file by default, whereas on *nix the symbols are embedded directly into the binary. So on non-Windows platforms you need to strip the symbols with the strip command, or you will end up with a very large binary:

strip ./TypedocConverter

If you want to see the final result, you can download the release files with native in the name here: https://github.com/hez2010/TypedocConverter/releases

Some may wonder how small NativeAOT output can get. In my experiments, a Hello World project can come in under 1 MB after disabling reflection and removing all root assemblies.

Known issues and limitations

.NET NativeAOT is expected to ship a supported preview in .NET 6 (it is in fact already quite stable). At this stage there are some known issues that affect usage, which I list here.

Not yet supported due to missing implementation (mainly post-C#-8 features that require runtime changes), expected to be resolved soon:

  • Default interface method implementations with generic methods are not supported
  • Covariant return types are not supported
  • catch (T) in try/catch statements is not supported, i.e., using a generic parameter as the exception type of a catch clause
  • Module initializers are not supported

Problems that will not be solved in the short term:

  • COM is not supported
  • C++/CLI is not supported

Limited by the fact that the runtime has no JIT and cannot be implemented:

  • Runtime dynamic code generation (e.g., System.Reflection.Emit)
  • Runtime dynamic assembly loading (e.g., Assembly.LoadFile)
  • Unbounded generic recursion

Some readers may not know what unbounded generic recursion means, so let me explain with code. Suppose you write the following:

struct U<T> { }

public void Foo<T>()
{
    // Each call instantiates Foo with a more deeply nested type argument
    if (bar) Foo<U<T>>();
}

This causes a compiler stack overflow. The code substitutes U<T> for T on each call. If a call does not change the generic nesting depth (for example, calling Foo<T> with T again), you can simply list the needed instantiations in rd.xml. But when the nesting depth changes with each call, the compiler cannot know at compile time how many levels you will expand (the NativeAOT compiler must expand all generics and generate code for every method and type involved at compile time), so it would generate code for T, U<T>, U<U<T>>, ... without bound, and compilation fails. Why is this no problem for a JIT? Because a JIT can generate types and code on demand at run time based on the bar condition.

I have fixed similar problems for ReactiveX and Entity Framework Core. For details, see:

GUI solutions

Because COM and C++/CLI will not be supported in the near term, WPF cannot be compiled to a native program with NativeAOT. The good news is that Avalonia, a cross-platform (Skia-based, self-drawn) WPF-like implementation, requires no COM at all and hits none of the known issues listed above, so it can be used to build cross-platform UI programs today.

Version 0.10.0 brought many optimizations and introduced compiled bindings, greatly improving performance; all animations render at 60 fps, and there is a Fluent Design theme library, making the experience very comfortable. I tried compiling my visual Traveling Salesman Problem solver with NativeAOT and got a 40 MB application (no runtime installation required) that starts instantly and uses less than 20 MB of memory at runtime. Small and beautiful indeed (cue smug lean-back).


On the left is a line chart with nearly 700,000 nodes; you can pan, zoom, and track points at will at 60 fps (it could go higher, but 60 fps rendering is the default for desktop GUI applications) without a hint of stutter (an ECharts chart implemented with WebGL would have frozen up long before this point).

Web Solutions

Naturally, ASP.NET Core supports NativeAOT (except Views in MVC). Entity Framework Core, however, does not yet, because it uses default interface methods with generics; this will be resolved as the NativeAOT compiler and libraries are updated.

Dapper, which relies heavily on runtime IL weaving, may never support NativeAOT. After all, you can't have your cake and eat it too.

Of course, converting dynamic code generation into static code generation with a source generator is one solution.

However, for ASP.NET Core there is one caveat: the framework loads controllers via reflection over the assembly, so no code path references the controller types directly. All controllers therefore get trimmed at compile time, and every API returns 404. This, too, can be solved by writing an rd.xml that tells the compiler to preserve those types.
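For example, a sketch with a hypothetical web-project assembly name:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Directives xmlns="http://schemas.microsoft.com/netfx/2013/01/metadata">
  <Application>
    <!-- "MyWebApp" is a placeholder; preserve the assembly that contains the
         controllers so reflection-based routing can still find them -->
    <Assembly Name="MyWebApp" Dynamic="Required All" />
  </Application>
</Directives>
```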

I compiled one of my own web services, which uses no ORM, only Microsoft.Data.Sqlite, with NativeAOT, and got a 30 MB program that can serve requests immediately after launch: memory consumption is only 20 MB, and the first request takes only 0.7 ms. The experience is excellent. This means that in a cloud-native environment, especially during scale-out, applications on new nodes can start and begin serving in well under a second, rather than waiting for health checks to pass some time after startup. Warm-up? Not needed!

Summary and Prospect

Without a doubt, NativeAOT greatly improves the startup speed and runtime performance of .NET programs, and the native binary is inherently harder to crack, truly delivering C# productivity with C++-like efficiency. In today's .NET 5 era, this toolchain has matured considerably; if you're interested, you can try it out ahead of the official release. In fact, there are already examples of production projects abroad using this toolchain.

In addition, the technology can compile native DLLs for consumption from other languages (such as C++), and can even be used for systems programming, for example building an EFI bare-metal bootloader (see the GitHub project zerosharp).

.NET NativeAOT is still exploring various possibilities; one I find particularly interesting:

In this scheme, RyuJIT first compiles IL to LLVM IR, applying optimizations for IL-specific patterns; LLVM then compiles the IR into a native binary, applying its own optimizations for smaller binaries and faster runtime code.

An earlier experiment that compiled to LLVM IR, LLILC, targeted LLVM IR directly, which meant losing RyuJIT's optimizations for IL-specific patterns. In the new experiment, RyuJIT acts as the "middle end", optimizing IL-specific patterns before handing off to LLVM, avoiding that deficiency.

In the future, .NET NativeAOT will also be brought to mobile platforms and the browser (WebAssembly). I will keep a close eye on this technology and follow its development.

Finally, I hope the .NET platform keeps getting better and better.