# Memory Management
VSL has automatic memory management, meaning you don't have to worry about allocating or freeing memory; the standard library and the language do that for you. That said, this page should give you a rough understanding of how VSL approaches memory management. You don't need to know any of this to use VSL, but keeping it in the back of your mind means you can write code that VSL can optimize better and avoid anti-patterns that ripple throughout your program. Additionally, this page discusses the `vsl analysis` tools, which give you an easy-to-understand picture of which parts of your code force inefficient memory-management techniques to be used.
### Does VSL use a tracing garbage collector?
No.
### Does VSL use restricted semantics like Rust?
No.
### Does VSL use reference counting?
Sometimes.
So VSL's memory management isn't a purist approach to the topic. It combines heavy static analysis, which minimizes the lifetime of objects (anything heap-allocated), with dynamic management. What does this mean? VSL analyzes the structure of your program to identify objects that don't need to be reference counted, and simply doesn't reference count them. Additionally, objects only have complex (non-deterministic) lifetimes in certain cases. Most of these cases can be identified statically, and the compiler can determine to what extent it needs to fall back on dynamic techniques, avoiding memory-management methods with significant overhead (such as reference counting) wherever possible.
Here's an example of a function which is completely contained (completely statically optimizable):

```
func checkIfPrime(num: Int) -> Bool {
    let factorizer = Factorizer()
    let factors: Int[] = factorizer.factors(num)
        .map(??.value)
    return factors.contains(only: [1, num])
}
```
As you can see, a number of objects are created in this example (the Factor objects, the array, the Factorizer itself), but they can all be cleaned up almost as soon as they are created.
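To make the "completely contained" idea concrete, here is a rough C++ analogue (the names and the factoring logic are illustrative assumptions, not VSL compiler output). Because nothing escapes the function, every object can be given a purely scope-bound lifetime and destroyed deterministically at the closing brace, with no reference counting involved; this is the moral equivalent of what VSL's optimizer does when it proves counting is unnecessary.

```cpp
#include <vector>

struct Factor { int value; };

// Hypothetical stand-in for VSL's Factorizer.
struct Factorizer {
    std::vector<Factor> factors(int num) const {
        std::vector<Factor> out;
        for (int i = 1; i <= num; ++i)
            if (num % i == 0) out.push_back({i});
        return out;
    }
};

// Every object here (the Factorizer, the Factor vector, the int vector)
// has a lifetime bounded by this scope, so no dynamic memory management
// is needed: destruction happens deterministically at scope exit.
bool checkIfPrime(int num) {
    Factorizer factorizer;
    std::vector<int> factors;
    for (const Factor& f : factorizer.factors(num))
        factors.push_back(f.value);
    // Prime iff the only factors are 1 and num itself.
    return factors.size() == 2 && factors.front() == 1 && factors.back() == num;
}
```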
### A complex case
Okay, so that was a simple case, but what about concurrent applications with large data structures representing large, complex objects (views, for example)?
```
class Application {
    let primaryView: View = View(forApp: self)
    let interactor: Interactor = Interactor(forView: self.primaryView)

    // within some function ...
    subView.previousView = self.primaryView
    self.primaryView = subView
}
```
As you can see, there are multiple relationships here (the interactor holds a weak reference to the view), and certain properties are updated dynamically. If a view, or any property of it, is retained somewhere, that retention prevents it from being freed. The optimizer can therefore tie the lifetime of objects that are probably never reassigned (such as the interactor) to a static cleanup alongside the application's lifetime; the old `primaryView`, whose ownership moves around, instead has its lifetime tied to `subView`, which now retains it through `previousView`. This essentially implements statically what a reference count would normally do at runtime.
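The weak/strong distinction above can be sketched with C++'s reference-counted smart pointers (a hypothetical analogue of the View/Interactor relationship, not VSL's actual runtime representation). The weak reference never extends the view's lifetime, while the strong `previousView` link does; once the last strong reference is dropped, the weak reference observes the deallocation:

```cpp
#include <memory>
#include <utility>

// Illustrative analogues of the VSL classes above.
struct View { std::shared_ptr<View> previousView; };   // strong link
struct Interactor { std::weak_ptr<View> view; };       // weak link

// Returns {aliveBeforeDrop, aliveAfterDrop} for the original view.
std::pair<bool, bool> lifetimeDemo() {
    auto primaryView = std::make_shared<View>();
    Interactor interactor{primaryView};   // weak: no refcount bump

    auto subView = std::make_shared<View>();
    subView->previousView = primaryView;  // strong: keeps the old view alive
    primaryView = subView;                // the app moves on to the new view

    bool aliveBefore = !interactor.view.expired();  // still retained by subView
    primaryView->previousView.reset();    // drop the last strong reference
    bool aliveAfter = !interactor.view.expired();   // now deallocated
    return {aliveBefore, aliveAfter};
}
```

VSL's optimizer aims to reach the same outcome without the runtime counter: it proves statically which link is the last retainer and schedules the cleanup there.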