# Overview of the workflow
The following sections go over the main program file `process_archetype_buildings.jl` and explain the high-level workflow of ArchetypeBuildingModel.jl. Links are provided to the detailed explanations of the different data structures and functions in the Library section for readers interested in the technical details.
## Command line arguments
The `process_archetype_buildings.jl` main program file has been primarily designed to be run via Spine Toolbox, but it can also be run directly from the command line if necessary. Regardless, the main program is controlled using the following command line arguments:

- The `url` to a Spine Datastore containing the required input data and archetype building definitions.
Furthermore, the following optional keyword arguments can be provided:
- `-spineopt <url>`, the url to a Spine Datastore where the produced SpineOpt input data should be written, if any.
- `-backbone <url>`, the url to a Spine Datastore where the produced Backbone input data should be written, if any.
- `-generic <url>`, the url to a Spine Datastore where the produced Generic input data should be written. This is essentially a dump of the raw ArchetypeBuildingModel.jl data structures, primarily useful for debugging purposes.
- `-results <url>`, the url to a Spine Datastore where the produced baseline HVAC demand results should be written. By default, the results are written back into the input Datastore at `url`.
- `-weather <url>`, the url to a Spine Datastore into which the autogenerated weather data is imported. If not provided, the weather data is written back into the input data `url`.
- `-save_layouts <false>`, controls whether auto-generated `building_weather` layouts are saved as images. Set to `false` by default, as this keyword exists primarily for debugging purposes, allowing visual inspection of the weather data weighting rasters.
- `-alternative <"">`, the name of the Spine Datastore alternative where the `-spineopt`, `-backbone`, or `-generic` input data are saved. An empty string by default, resulting in Spine Toolbox automatically generating an alternative name when importing the parameter values.
- `-realization <realization>`, the name of the stochastic scenario containing true data over forecasts. Only relevant if stochastic weather and/or load data is used.
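For illustration, a hedged example invocation from the command line might look as follows. The Datastore urls are placeholders, and the exact argument syntax is assumed from the list above:

```
julia process_archetype_buildings.jl sqlite:///input_data.sqlite -spineopt sqlite:///spineopt_input.sqlite -save_layouts true
```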
## Input database tests
Next, the main program opens the input Datastore, and performs a series of tests on the input data and archetype building definitions to check that they make sense. If not, the main program will halt and display the test results and error messages for the user, in order to help them deduce what is wrong with the input data or definitions.
The input Datastore tests are handled by the `run_object_class_tests`, `run_parameter_tests`, and `run_structure_type_tests` functions.
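The following sketch shows roughly how the checks could be invoked in an interactive session. The placeholder url and the argument-free test calls are assumptions, as the actual signatures may differ:

```julia
using ArchetypeBuildingModel, SpineInterface

# Open the input Datastore so its contents are accessible in Julia
# (the url below is a placeholder).
using_spinedb("sqlite:///input_data.sqlite")

# Run the input data and definition checks; each one reports errors
# if the corresponding part of the input data doesn't make sense.
run_object_class_tests()
run_parameter_tests()
run_structure_type_tests()
```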
## Process `ScopeData` structs
As explained in The `building_scope` definition section, the `building_scope` defines the geographical and statistical scope for a `building_archetype`. Before we can begin creating lumped-capacitance thermal models for the archetype buildings, the main program first needs to know the aggregated average basic properties of each archetype. Thus, the next step is to process and create the `ScopeData` structs for all the `building_scope`s attached to a `building_archetype` via a `building_archetype__building_scope` relationship (scopes not attached to any archetype are not processed, to save time).
The final `ScopeData` structs are stored into a `scope_data_dictionary`, which can be examined through the Julia REPL after the main program has finished. Alternatively, the `-generic` keyword can be used to export the raw data structures to a Spine Datastore for inspection.
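For example, assuming the main program was run interactively (e.g. via `include("process_archetype_buildings.jl")`), the dictionary can be browsed directly in the REPL:

```julia
# List the processed building_scopes, then peek at one ScopeData struct.
keys(scope_data_dictionary)
first(values(scope_data_dictionary))
```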
## Process `WeatherData` structs
After the main program is done with processing the `ScopeData` structs, the next step is to process the `building_weather` definitions into `WeatherData` before forming the archetype building lumped-capacitance thermal models. However, since The `building_weather` definition isn't mandatory, the main program first checks which `building_archetype` definitions lack a `building_archetype__building_weather` relationship, and tries to fetch the weather data automatically using the ArchetypeBuildingWeather.py sub-module. Essentially, the automatic weather generation is handled via the `create_building_weather` function, based on The `building_archetype` definition and the processed `ScopeData`.
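A hedged sketch of the missing-weather check, assuming the SpineInterface-style convenience accessors generated when the input Datastore is opened via `using_spinedb`:

```julia
# Collect the archetypes without a linked building_weather definition;
# these are the ones for which weather data is auto-generated via the
# ArchetypeBuildingWeather.py sub-module.
archetypes_missing_weather = [
    archetype for archetype in building_archetype()
    if isempty(
        building_archetype__building_weather(building_archetype = archetype)
    )
]
```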
The main program will import the automatically generated `building_weather` objects into the Spine Datastore at `-weather <url>`, as well as link them to the appropriate `building_archetype`s via the `building_archetype__building_weather` relationship. Regardless, the final `WeatherData` structs are stored into a `weather_data_dictionary`, which can be examined through the Julia REPL after the main program has finished. Alternatively, the `-generic` keyword can be used to export the raw data structures to a Spine Datastore for inspection.
## Process `ArchetypeBuilding` structs
With all the pieces now in place, the main program can finally process all the data into lumped-capacitance thermal models depicting the desired synthetic average archetype buildings. This is handled by the `ArchetypeBuilding` struct constructor, which takes as input The `building_archetype` definition, as well as the appropriate `ScopeData` and `WeatherData` processed during the previous steps.
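A minimal sketch of this construction step for a single archetype, assuming the relationship accessors from `using_spinedb` and assuming the dictionaries above are keyed by the scope and weather objects; the actual constructor signature in ArchetypeBuildingModel.jl may include further arguments:

```julia
archetype = first(building_archetype())
# Look up the scope and weather linked to this archetype.
scope = first(building_archetype__building_scope(building_archetype = archetype))
weather = first(building_archetype__building_weather(building_archetype = archetype))
# Construct the lumped-capacitance thermal model description.
archetype_building = ArchetypeBuilding(
    archetype,
    scope_data_dictionary[scope],
    weather_data_dictionary[weather],
)
```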
The `ArchetypeBuilding` struct contains all the information about the final lumped-capacitance thermal model of the synthetic average archetype building, as well as the definitions used in its construction. The final `ArchetypeBuilding` structs are stored into an `archetype_dictionary`, which can be examined through the Julia REPL after the main program has finished. Alternatively, the `-generic` keyword can be used to export the raw data structures to a Spine Datastore for inspection.
## Solve the HVAC demand
After processing the `ArchetypeBuilding`s, the main program will calculate a baseline/reference heating/cooling demand using the lumped-capacitance thermal models of the synthetic average archetype buildings. This is handled by the `ArchetypeBuildingResults` struct and its constructor.
The final `ArchetypeBuildingResults` are written back into the input Datastore, as well as stored into an `archetype_results_dictionary`, which can be examined through the Julia REPL after the main program has finished. Alternatively, the `-generic` keyword can be used to export the raw data structures to a Spine Datastore for inspection.
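A hedged sketch of this step, assuming the constructor takes an `ArchetypeBuilding` as its primary argument; the real constructor likely accepts further options (e.g. the `-realization` scenario):

```julia
# Solve the baseline HVAC demand for every processed archetype building.
archetype_results_dictionary = Dict(
    archetype => ArchetypeBuildingResults(archetype_building)
    for (archetype, archetype_building) in archetype_dictionary
)
```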
## Export SpineOpt input data
If the `-spineopt <url>` argument is given, the main program will attempt to convert the `ArchetypeBuilding`s in the `archetype_dictionary` into SpineOpt energy system model input data, and export that input data into the Spine Datastore at the given `url`.
The input data creation is handled by the `SpineOptInput` struct and its constructor.
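Conceptually, the export follows a construct-then-import pattern. The call below is a hedged sketch, as the exact constructor arguments and import step are assumptions; the Backbone and Generic exports in the following sections work analogously:

```julia
# Convert the processed archetype buildings into SpineOpt input data.
spineopt_input = SpineOptInput(archetype_dictionary)
# The resulting input data is then imported into the Spine Datastore
# given via the `-spineopt <url>` argument.
```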
## Export Backbone input data
If the `-backbone <url>` argument is given, the main program will attempt to convert the `ArchetypeBuilding`s in the `archetype_dictionary` into Backbone energy system model input data, and export that input data into the Spine Datastore at the given `url`.
The input data creation is handled by the `BackboneInput` struct and its constructor.
## Export Generic input data
If the `-generic <url>` argument is given, the main program will attempt to save the processed `ArchetypeBuildingResults` in the `archetype_results_dictionary` in their entirety into the Spine Datastore at the given `url`. Essentially, this means saving all of the following:
- `building_weather` definitions, with the addition of the effective ground temperature from `ArchetypeBuildingModel.calculate_effective_ground_temperature`.
- `building_scope` definitions, with their associated `ScopeData`.
- `building_archetype` definitions, with their associated `EnvelopeData`, `BuildingNodeData`, `BuildingProcessData`, `LoadsData`, `AbstractNode`, and `AbstractProcess`.
The name "Generic" is perhaps a bit misleading, and should probably be renamed to e.g. "Raw" at some future point in time?