Julia (programming language)


Julia is a high-level, high-performance, dynamic programming language. While it is a general-purpose language and can be used to write any application, many of its features are well suited for numerical analysis and computational science.
Distinctive aspects of Julia's design include a type system with parametric polymorphism in a dynamically typed language, and multiple dispatch as its core programming paradigm. Julia supports concurrent, parallel, and distributed computing, as well as direct calling of C and Fortran libraries without glue code. Julia uses a just-in-time (JIT) compiler that is referred to as "just-ahead-of-time" compilation in the Julia community, as Julia compiles each function to machine code before running it.
Julia is garbage-collected, uses eager evaluation, and includes efficient libraries for floating-point calculations, linear algebra, random number generation, and regular expression matching. Many libraries are available, including some that were previously bundled with Julia and are now separate.
Several development tools support coding in Julia, such as integrated development environments with integrated tools, e.g. a linter, profiler, and debugger; the Rebugger.jl package additionally "supports repeated-execution debugging" and more.

History

Work on Julia was started in 2009, by Jeff Bezanson, Stefan Karpinski, Viral B. Shah, and Alan Edelman, who set out to create a free language that was both high-level and fast. On 14 February 2012, the team launched a website with a blog post explaining the language's mission. In an interview with InfoWorld in April 2012, Karpinski said of the name "Julia": "There's no good reason, really. It just seemed like a pretty name." Bezanson said he chose the name on the recommendation of a friend.
Since the 2012 launch, the Julia community has grown, with over 13,000,000 downloads as of 2020. The official Julia Docker images on Docker Hub have seen over 4,000,000 downloads as of January 2019. The JuliaCon academic conference for Julia users and developers has been held annually since 2014.
Version 0.3 was released in August 2014, version 0.4 in October 2015, version 0.5 in October 2016, and version 0.6 in June 2017. Both Julia 0.7 and version 1.0 were released on 8 August 2018. Work on Julia 0.7 was a "huge undertaking", and some changes were made to semantics, e.g. the iteration interface was simplified; and the syntax changed a little.
The release candidate for Julia 1.0 was released on 7 August 2018, and the final version a day later. Julia 1.1 was released in January 2019 with, e.g., a new "exception stack" language feature. Bugfix releases are expected roughly monthly for the 1.4.x and 1.0.x lines, and Julia 1.0.1 through 1.0.5 followed that schedule. Julia 1.2 was released in August 2019 with, e.g., some built-in support for web browsers. Julia 1.3 added, e.g., composable multi-threaded parallelism and a binary artifacts system for Julia packages, and Julia 1.4 followed in March 2020.
Julia 1.4 allowed better syntax for array indexing to handle, e.g., 0-based arrays, with A[begin+1] selecting the second element of an array A. The memory model was also changed. Minor release 1.4.2 fixed, e.g., a Zlib issue, doubling decompression speed.
The yet-to-be-released Julia 1.5 adds record-and-replay debugging support based on Mozilla's rr tool; rr can also be used manually, without Julia's built-in support, in earlier versions.
Most packages that work in Julia 1.0.x also work in 1.1.x or newer, enabled by the compatibility guarantee for syntax across 1.x releases. A notable exception has been foreign-language interface libraries such as JavaCall.jl and RCall.jl, due to some threading-related changes. The issue is especially complicated for Java's JVM, as it has some special expectations about how the stack address space is used. A workaround has been posted for Julia 1.3.0, while a full fix for Java is pending and has no set due date; additionally, JVM versions since Java 11 do not exhibit the problem.

Notable uses

Julia has attracted some high-profile users, from investment manager BlackRock, which uses it for time-series analytics, to the British insurer Aviva, which uses it for risk calculations. In 2015, the Federal Reserve Bank of New York used Julia to make models of the United States economy, noting that the language made model estimation "about 10 times faster" than its previous MATLAB implementation. Julia's co-founders established Julia Computing in 2015 to provide paid support, training, and consulting services to clients, though Julia remains free to use. At the 2017 JuliaCon conference, Jeffrey Regier, Keno Fischer and others announced that the Celeste project used Julia to achieve "peak performance of 1.54 petaFLOPS using 1.3 million threads" on 9300 Knights Landing nodes of the Cori II supercomputer. Julia thus joins C, C++, and Fortran as high-level languages in which petaFLOPS computations have been achieved.
Three of the Julia co-creators are the recipients of the 2019 James H. Wilkinson Prize for Numerical Software "for the creation of Julia, an innovative environment for the creation of high-performance tools that enable the analysis and solution of computational science problems." Also, Alan Edelman, professor of applied mathematics at MIT, has been selected to receive the 2019 IEEE Computer Society Sidney Fernbach Award "for outstanding breakthroughs in high-performance computing, linear algebra, and computational science and for contributions to the Julia programming language."
Julia Computing and NVIDIA announced "the availability of the Julia programming language as a pre-packaged container on the NVIDIA GPU Cloud container registry", with NVIDIA stating that Julia can be easily deployed on x86 and Arm and that Julia "offers a package for a comprehensive HPC ecosystem covering machine learning, data science, various scientific domains and visualization".
Additionally, "Julia was selected by the as the sole implementation language for their next generation global climate model. This multi-million dollar project aims to build an earth-scale climate model providing insight into the effects and challenges of climate change."
Julia is also used, e.g., by NASA and by its Brazilian counterpart for space mission planning and satellite simulation.

Sponsors

Julia has received contributions from over 870 developers worldwide. Dr. Jeremy Kepner at MIT Lincoln Laboratory was the founding sponsor of the Julia project in its early days. In addition, funds from the Gordon and Betty Moore Foundation, the Alfred P. Sloan Foundation, Intel, and agencies such as NSF, DARPA, NIH, NASA, and FAA have been essential to the development of Julia. Mozilla, the maker of the Firefox web browser, with its research grants for H1 2019, sponsored "a member of the official Julia team" for the project "Bringing Julia to the Browser", meaning to Firefox and other web browsers.

Julia Computing

Julia Computing, Inc. was founded in 2015 by Viral B. Shah, Deepak Vinchhi, Alan Edelman, Jeff Bezanson, Stefan Karpinski and Keno Fischer.
In June 2017, Julia Computing raised US$4.6 million in seed funding from General Catalyst and Founder Collective. In the same month it was "granted $910,000 by the Alfred P. Sloan Foundation to support open-source Julia development, including $160,000 to promote diversity in the Julia community", and in December 2019 the company received $1.1 million in funding from the US government to "develop a neural component machine learning tool to reduce the total energy consumption of heating, ventilation, and air conditioning (HVAC) systems in buildings".

Language features

Though designed for numerical computing, Julia is a general-purpose programming language.
It is also useful for low-level systems programming, as a specification language, and for web programming on both the server and the client side.
According to the official website, the main features of the language are:
Multiple dispatch is a generalization of single dispatch, the polymorphic mechanism used in common object-oriented programming languages that relies on inheritance. In Julia, all concrete types are subtypes of abstract types, which are directly or indirectly subtypes of the Any type, the top of the type hierarchy. Concrete types cannot themselves be subtyped the way they can in other languages; composition is used instead.
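For example (a minimal sketch; the Pet, Dog, Cat, and meets names are illustrative, not part of any standard library), a single generic function can have several methods, and a call is dispatched on the runtime types of all of its arguments:

abstract type Pet end                     # abstract types can be subtyped
struct Dog <: Pet; name::String; end      # concrete types cannot be subtyped
struct Cat <: Pet; name::String; end

meets(a::Dog, b::Dog) = "sniffs"          # one generic function, four methods
meets(a::Dog, b::Cat) = "chases"
meets(a::Cat, b::Dog) = "hisses"
meets(a::Cat, b::Cat) = "slinks away"

encounter(a::Pet, b::Pet) = println(a.name, " meets ", b.name, " and ", meets(a, b))

encounter(Dog("Rex"), Cat("Whiskers"))    # prints: Rex meets Whiskers and chases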
Julia draws significant inspiration from various dialects of Lisp, including Scheme and Common Lisp, and it shares many features with Dylan, also a multiple-dispatch-oriented dynamic language, and with Fortress, another numerical programming language. While Common Lisp Object System adds multiple dispatch to Common Lisp, not all functions are generic functions.
In Julia, Dylan, and Fortress, extensibility is the default, and the system's built-in functions are all generic and extensible. In Dylan, multiple dispatch is as fundamental as it is in Julia: all user-defined functions and even basic built-in operations like + are generic. Dylan's type system, however, does not fully support parametric types, which are more typical of the ML lineage of languages. By default, CLOS does not allow for dispatch on Common Lisp's parametric types; such extended dispatch semantics can only be added as an extension through the CLOS Metaobject Protocol. By convergent design, Fortress also features multiple dispatch on parametric types; unlike Julia, however, Fortress is statically rather than dynamically typed, with separate compiling and executing phases. The language features are summarized in the following table:
Language      Type system   Generic functions   Parametric types
Julia         Dynamic       Default             Yes
Common Lisp   Dynamic       Opt-in              Yes
Dylan         Dynamic       Default             Partial
Fortress      Static        Default             Yes
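As a sketch of the "parametric types" column for Julia (the Point and describe names are illustrative), a method can dispatch on a type's parameters as well as on the type itself:

struct Point{T<:Real}                     # a parametric concrete type
    x::T
    y::T
end

describe(p::Point{<:Integer}) = "a point on the integer grid"
describe(p::Point{<:AbstractFloat}) = "a point with floating-point coordinates"

println(describe(Point(1, 2)))            # dispatches on Point{Int}
println(describe(Point(1.0, 2.0)))        # dispatches on Point{Float64}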

By default, the Julia runtime must be pre-installed on the machine where user-provided source code is run. Alternatively, a standalone executable that needs no Julia source code can be built with ApplicationBuilder.jl and PackageCompiler.jl.
Julia's syntactic macros, like Lisp macros, are more powerful than text-substitution macros used in the preprocessor of some other languages such as C, because they work at the level of abstract syntax trees. Julia's macro system is hygienic, but also supports deliberate capture when desired using the esc construct.
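A small sketch of such a macro (the @show_twice name is hypothetical): the macro receives an abstract syntax tree rather than text, the local variable val is hygienic and cannot clash with names at the call site, and esc is used to deliberately evaluate the given expression in the caller's scope:

macro show_twice(ex)
    quote
        val = $(esc(ex))                  # esc: evaluate ex in the caller's scope
        println("first:  ", val)
        println("second: ", val)
        val
    end
end

x = 21
@show_twice 2x                            # prints 42 twice, then returns 42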

Interaction

The Julia official distribution includes an interactive command-line read–eval–print loop (REPL), with a searchable history, tab completion, and dedicated help and shell modes, which can be used to experiment and test code quickly. The following is a sample session in which println automatically joins its arguments, strings and computed values alike, into one line of output:

julia> p(x) = 2x^2 + 1; f(x, y) = 1 + 2p(x)y

julia> println("Hello world!", " I'm on cloud ", f(0, 4), " as Julia supports recognizable syntax!")
Hello world! I'm on cloud 9 as Julia supports recognizable syntax!

The REPL gives the user access to the system shell and to a help mode, by pressing ; or ? after the prompt, respectively. It also keeps the history of commands, including between sessions. Code can be tested inside Julia's interactive session or saved into a file with a .jl extension and run from the command line by typing:

$ julia <filename>

Julia is also supported by Jupyter, an interactive, browser-based "notebook" environment.

Use with other languages

Julia is in practice interoperable with many languages. Julia's ccall keyword is used to call functions in C-exported or Fortran shared libraries individually, and packages exist to call other languages, e.g. Python, R, MATLAB, Java, or Scala, indirectly on your behalf. Packages for other languages, e.g. pyjulia for Python, likewise allow calling Julia from those languages.
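A minimal ccall sketch, following the pattern used in the Julia manual (here calling the C standard library's clock function; the Int32 return type is an assumption about the platform's clock_t width):

t = ccall(:clock, Int32, ())              # call libc's clock() with no glue code
println("clock() returned ", t)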
Julia has support for Unicode 12.1, with UTF-8 used both for strings and for Julia source code, which also allows common mathematical symbols to be used, as an option, for many operators, such as ∈ for the in operator.
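For example, in a REPL session (∈ can be entered by typing \in followed by the tab key):

julia> 2 ∈ [1, 2, 3]
true

julia> 2 in [1, 2, 3]                     # equivalent ASCII spelling
true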
Julia has packages supporting markup languages such as HTML, XML, JSON and BSON, and for databases and Web use in general.

Package system

Julia has a built-in package manager and includes a default registry system. Packages are most often distributed as source code hosted on GitHub, though alternative hosting can be used just as well. Packages can also be installed as binaries, using artifacts. Julia's package manager is used to query and compile packages, as well as to manage environments. Federated package registries are supported, allowing registries other than the official one to be added locally.
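Typical usage might look like the following sketch, where JSON.jl merely stands in for any registered package:

julia> using Pkg

julia> Pkg.add("JSON")        # fetch and install a package from a registry

julia> Pkg.status()           # list the packages in the active environment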

Uses

Julia has been used to perform petascale computing with the Celeste library for sky surveys, and is used by BlackRock for its engineering analytics platforms.

Implementation

Julia's core is implemented in Julia and C, together with C++ for the LLVM dependency. The parsing and code-lowering are implemented in FemtoLisp, a Scheme dialect. The LLVM compiler infrastructure project is used as the back end for generation of 64-bit or 32-bit optimized machine code depending on the platform Julia runs on. With some exceptions, the standard library is implemented in Julia. The most notable aspect of Julia's implementation is its speed, which is often within a factor of two relative to fully optimized C code. Development of Julia began in 2009 and an open-source version was publicized in February 2012.
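This pipeline can be inspected interactively; a small sketch (f is an arbitrary example function):

using InteractiveUtils        # provides the @code_* reflection macros

f(x) = 2x + 1

@code_lowered f(3)            # lowered form produced by the FemtoLisp front end
@code_llvm f(3)               # LLVM intermediate representation
@code_native f(3)             # optimized machine code for the host CPU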

Current and future platforms

While Julia uses JIT compilation, it generates native machine code directly, just before a function is first run.
Julia has four support tiers. All 32-bit x86 processors from the i686 onward, and all 64-bit x86 (amd64) processors, are supported. ARMv8 processors are fully supported, and ARMv7 and ARMv6 are supported with some caveats. CUDA (i.e. Nvidia GPUs) has tier 1 support with the help of an external package. There are additionally packages supporting other accelerators, such as Google's TPUs, and AMD GPUs also have support, e.g. through OpenCL, plus experimental support for the AMD ROCm stack. Julia's downloads page provides executables for all the officially supported platforms.
On some platforms, Julia may need to be compiled from source code with specific build options; Julia has been built on several ARM platforms. PowerPC has tier 3 support, meaning it "may or may not build".
Julia is supported in Raspbian, though support is better on newer Raspberry Pis, e.g. those with ARMv7 or newer; Julia support is promoted by the Raspberry Pi Foundation.
There is also support for web browsers/JavaScript through JSExpr.jl, and WebAssembly, the alternative language of web browsers, has minimal support through several upcoming external Julia projects.
Julia can compile to ARM; thus, in theory, Android apps could be built with the NDK, but so far Julia has been made to run on Android only indirectly, i.e. through an Ubuntu chroot on Android.