Validation

In addition to our own rigorous testing, all of our software has been validated against a wide variety of common standards to ensure the software produces results you can rely on.

EPC & Part L2A

Tas Engineering v9.5.6 was approved for EPC and Part L 2021 for England, Wales & Scotland by DCLG in September 2023:

Tas Engineering v9.5.5 was approved for EPC and Part L 2021 for England and Wales by DCLG on 04/04/2023:

Tas Engineering v9.5.4 was approved for EPC and Part L 2021 for England by DCLG on 29/06/2022:

Tas Engineering v9.5.0 was approved for EPC and Part L 2013 for England by DCLG on 13/01/2020:

Tas Engineering v9.4.4 was approved for EPC and Part L 2013 for England by DCLG on 28/01/2019:

Tas Engineering v9.4.3 was approved for EPC and Part L 2013 for Wales by the Welsh Government on 17/05/2018:

Tas Engineering v9.4.2 was approved for EPC and Part L 2013 for England by DCLG on 21/03/2018:

Tas Engineering v9.4 was approved for EPC and Part L 2013 for England by DCLG on 21/10/2016:

Tas Engineering v9.3 was approved for EPC and Part L 2013 for England by DCLG on 01/05/2014:

Tas Engineering v9.3 was approved for EPC and Part L 2014 for Wales by the Welsh Government on 02/10/2014:

ASHRAE 140-1

Tas has completed the building envelope and HVAC equipment performance tests required by ASHRAE 140-1 (2004), Sections 5.2 and 5.3, and by ASHRAE 140-1 (2007 & 2014), Sections 5.2, 5.3A, 5.3B and 5.4.

Tas 9.5.x ASHRAE 140-1 (2007 & 2014) Envelope and HVAC equipment performance results:

Tas 9.4.x ASHRAE 140-1 (2007 & 2014) Envelope and HVAC equipment performance results:

Tas 9.3.x ASHRAE 140-1 (2007) Envelope and HVAC equipment performance results:

Tas 9.2.x ASHRAE 140-1 (2007) Envelope and HVAC equipment performance results:

Tas 9.2.x ASHRAE 140-1 (2004) Envelope and HVAC equipment performance results:

Of the results required for Section 5.2 output, Tas is out of range on only seven occasions. Although not required by 140-1, a brief explanation of these results is given here.

In test 440 Tas is out of range by only 0.5%, and this test was not completed by all packages, so there is less comparative data. It is also clear from the comparative results that only two other packages rigorously account for radiation exiting back out through the window (test 440 has highly reflective internal surfaces, meaning radiation bounces many times and more of it is lost back through the glazing). These three results are very tightly packed: 3.967 MWh, 3.975 MWh and our result of 3.949 MWh.
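
As a quick sanity check on the figures quoted above (the three values in MWh are taken directly from the paragraph), the spread of the tightly packed results works out as follows:

```python
# The three "rigorous window back-loss" results for test 440 quoted above
# (values in MWh): two other packages, then the Tas result.
results = [3.967, 3.975, 3.949]

spread = max(results) - min(results)
relative_spread = spread / max(results)

print(f"absolute spread: {spread:.3f} MWh")    # 0.026 MWh
print(f"relative spread: {relative_spread:.2%}")
```

The three results agree to within roughly two thirds of one percent.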

The other six out-of-range tests are all free-float cases where we are consistently very slightly cooler than the range allows. This is because Tas employs a sophisticated radiation back-loss model that incorporates cloud data and humidity to predict a more accurate sky temperature. We have confirmed this by substituting a typical back-loss model, whereupon all six cases fall within range. Because this back-loss model has shown good empirical agreement in the past, we prefer to retain it and fall out of range in these cases.
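
To illustrate the kind of model being described, the sketch below estimates an effective sky temperature from air temperature, humidity (via dew point) and cloud cover. It is a generic, textbook-style illustration only, not Tas's actual back-loss model: the clear-sky emissivity follows the Berdahl-Martin correlation, and the linear cloud correction coefficient of 0.8 is an illustrative assumption.

```python
# Generic illustration (NOT Tas's actual model): estimating an effective
# sky temperature from air temperature, dew point and cloud cover, the
# three inputs a radiation back-loss model of this kind typically uses.

def sky_temperature(t_air_c, t_dewpoint_c, cloud_fraction):
    """Effective sky temperature in deg C.

    Clear-sky emissivity: Berdahl-Martin correlation (dew point in deg C).
    Cloud correction: simple linear blend, coefficient 0.8 (assumed).
    """
    td = t_dewpoint_c / 100.0
    eps_clear = 0.711 + 0.56 * td + 0.73 * td ** 2
    # Cloudy skies are warmer (more emissive); cloud_fraction: 0 clear, 1 overcast
    eps_sky = eps_clear + (1.0 - eps_clear) * 0.8 * cloud_fraction
    t_air_k = t_air_c + 273.15
    return t_air_k * eps_sky ** 0.25 - 273.15

# A clear, dry night sky is much colder than the air...
clear = sky_temperature(10.0, 0.0, 0.0)
# ...while an overcast, humid sky stays close to air temperature.
overcast = sky_temperature(10.0, 9.0, 1.0)
print(f"clear sky: {clear:.1f} C, overcast sky: {overcast:.1f} C")
```

A colder predicted sky temperature increases longwave back loss, which is consistent with the free-float results above sitting very slightly below the range.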

ISO Standards

Tas 9.5.2 has demonstrated compliance to the following BS EN ISO standards. Links are provided below to download compliance reports and the Tas files used in the creation of those reports.

EN ISO 13791:2012

EN ISO 13792:2012

EN ISO 15255:2007

EN ISO 15265:2007


Tas 9.3.0 has demonstrated compliance to the following BS EN ISO standards. Links are provided below to download compliance reports and the Tas files used in the creation of those reports.

EN ISO 13791:2012

EN ISO 13792:2012

EN ISO 15255:2007

EN ISO 15265:2007

BEEM

Building Energy and Environmental Modelling

The Building Energy and Environmental Modelling (BEEM) software checklist can be downloaded here (note: this has been superseded by the CIBSE AM11 documentation released in 2015):

CIE 171:2006

(Accuracy of Lighting Computer Programs)

The CIE Technical Report CIE 171:2006 (Test Cases to Assess the Accuracy of Lighting Computer Programs) was produced to assist developers and users of lighting analysis software to validate results. Test cases have been created using TAS Daylight and the results have been validated against the results given in the CIE report. (It can be purchased at http://www.techstreet.com/standards/cie/171_2006?product_id=1253803 )

The report states that the primary target of the test cases is physically based computer simulation such as radiosity, this being the method that TAS Daylight uses. The report covers artificial lighting scenarios as well as daylighting scenarios, although only the latter are applicable to TAS Daylight. The relevant test cases all have reference values which are calculated analytically from physical laws. Each test case addresses a different aspect of light propagation.
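
The radiosity idea the report refers to can be sketched in a few lines: each diffuse surface's radiosity B is its direct light E plus the reflected fraction of what it receives from every other surface, B_i = E_i + rho_i * sum_j F_ij * B_j, iterated to convergence. The two-patch scene below is invented purely for illustration and has nothing to do with TAS Daylight's actual implementation.

```python
# Minimal radiosity sketch: Jacobi iteration of B = E + rho * F @ B
# on a toy scene of two facing diffuse patches (each sees only the other,
# so both form factors are 1). Illustrative only.

def solve_radiosity(E, rho, F, iters=100):
    """Iterate the radiosity equation to convergence."""
    B = E[:]
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(len(B)))
             for i in range(len(B))]
    return B

E = [100.0, 0.0]      # direct flux: only patch 0 is lit
rho = [0.5, 0.5]      # diffuse reflectances
F = [[0.0, 1.0],      # form factors: each patch sees only the other
     [1.0, 0.0]]

B = solve_radiosity(E, rho, F)
print(B)              # converges to [133.33..., 66.66...]
```

The closed-form check: B0 = 100 + 0.5*B1 and B1 = 0.5*B0 give B0 = 400/3 and B1 = 200/3, which the iteration reproduces.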

The test cases that were applicable and used to validate TAS Daylight are:

  • 5.4 Luminous flux conservation – to verify that the incident luminous flux at an unglazed opening surface equals the total direct flux reaching the internal surfaces.
  • 5.5 Directional transmittance of clear glass – to assess the capability to take the directional transmittance into consideration, given that the light transmission through glass materials varies with the angle of incidence at which the light arrives at the glass surface.
  • 5.6 Light reflection over diffuse surfaces – to assess the accuracy in computing the light reflection over diffuse surfaces. This applies to surfaces within a room and also to the reflection of daylight on the external ground and shade surfaces.
  • 5.7 Diffuse reflection with internal obstructions – to verify the capability to simulate the shading influence of obstructions on, e.g., the daylight received through an aperture. This test has a higher level of complexity, with results that are more sensitive to calculation parameters, e.g. those specifying the radiosity meshing.
  • 5.8 Internal reflected component calculation for diffuse surfaces – to assess the accuracy of the diffuse inter-reflections inside a room. This test requires a fully converged radiosity solution to verify the average indirect illuminance inside a room.
  • 5.9 Sky component for a roof unglazed opening and the CIE general sky types – to test the calculation of the sky component under different sky conditions, in particular the standard CIE sky types.
  • 5.10 Sky component under a roof glazed opening – to verify the simulation of the influence of glass with a given directional transmission under the different types of CIE general skies.
  • 5.11 Sky component and external reflected component for a façade unglazed opening – to verify the correct calculation of the contribution of the external ground and the sky luminance distribution to the internal illuminance of a room with a façade opening.
  • 5.12 SC+ERC for a façade glazed opening – to verify the calculation of a daylight factor under the standard CIE sky types and a glazed façade opening. (NB: SC is sky component and ERC is external reflected component)
  • 5.13 SC+ERC for an unglazed façade opening with a continuous external horizontal mask – to verify the calculation simulating the influence of an external horizontal mask (i.e. shade) on the internal direct illuminance.
  • 5.14 SC+ERC for an unglazed façade opening with a continuous external vertical mask – to verify the calculation simulating the influence of an external vertical mask on the internal direct illuminance.
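
The angle dependence probed by test 5.5 follows from the Fresnel equations. As a rough, generic physics illustration (a single air-glass interface with refractive index n = 1.52, no absorption; a real pane has two interfaces and multiple internal reflections, and this is not TAS Daylight's code):

```python
import math

# Fresnel transmittance of one air-to-glass interface, averaged over both
# polarisations. Generic physics sketch only.

def fresnel_transmittance(theta_deg, n=1.52):
    """Unpolarized transmittance at incidence angle theta_deg (degrees)."""
    ti = math.radians(theta_deg)
    if ti == 0.0:
        r = ((1.0 - n) / (1.0 + n)) ** 2                # normal incidence
        return 1.0 - r
    tt = math.asin(math.sin(ti) / n)                    # Snell's law
    rs = (math.sin(ti - tt) / math.sin(ti + tt)) ** 2   # s-polarised
    rp = (math.tan(ti - tt) / math.tan(ti + tt)) ** 2   # p-polarised
    return 1.0 - 0.5 * (rs + rp)

# Transmittance falls off sharply towards grazing incidence:
for angle in (0, 30, 60, 85):
    print(f"{angle:2d} deg -> T = {fresnel_transmittance(angle):.3f}")
```

At normal incidence roughly 96% of the light is transmitted through the interface, but near grazing incidence most of it is reflected, which is exactly the directional behaviour the test case verifies.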

For each of the above test cases, the required model was created in the TAS Building Modeller and the required reflectance and transmission values were set for the surfaces, including the ground surface. Note, however, that it is not possible to implement the tests via the standard functionality of the user interface, so additional options were implemented. These options were made available only for the purpose of these tests; they are not applicable to standard usage (i.e. for daylight factors) and could in fact confuse the user if exposed.

For example, it is necessary to isolate the light from the sky model alone, without the direct sun component. Another requirement is to be able to set the angles of the sun position exactly for some of the tests. Many of the tests require an unglazed opening; at present it is not possible to set a window pane via the user interface to have no glazing, as a pane is either transparent with a transmission factor or opaque. For the 5.8 test, a point light had to be created programmatically in a space with no windows.

It was not possible to implement some test requirements, in particular a uniform luminance for the ground and external masks, this being an artificial condition. As a result, the full results of tests such as 5.11 could not be validated to the required accuracy at some points.

The tests were performed only for the CIE general sky types that are available in TAS Daylight. These are CIE Overcast and CIE Clear (Type 12).
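
For reference, the CIE Standard Overcast sky mentioned above has a simple analytic luminance distribution (Moon-Spencer): relative to the zenith luminance Lz, a sky patch at altitude gamma has luminance L(gamma) = Lz * (1 + 2*sin(gamma)) / 3, independent of azimuth, so the horizon is one third as bright as the zenith. A small check:

```python
import math

# Relative luminance L(gamma)/Lz of the CIE Standard Overcast sky
# (Moon-Spencer distribution) at a given altitude angle in degrees.

def overcast_luminance_ratio(altitude_deg):
    """L(gamma)/Lz for the CIE Standard Overcast sky."""
    return (1.0 + 2.0 * math.sin(math.radians(altitude_deg))) / 3.0

print(overcast_luminance_ratio(0))    # horizon: 1/3
print(overcast_luminance_ratio(90))   # zenith:  1.0
```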

CFD

(Computational Fluid Dynamics)

All the simulations and their results, found by following the links below, have been generated by exactly the same code (either 2D or 3D). Exactly the same algorithms have been employed, with no special meshing or user intervention required, as all the problems have been specified on a completely uniform Cartesian grid. There have been no changes in the type of turbulence modelling between problems, no changes to the differencing scheme, and no parameter tuning of wall functions.
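
As a toy illustration of that workflow (specify boundary values on a completely uniform Cartesian grid, then iterate), the sketch below solves a 2D steady diffusion (Laplace) problem with Jacobi sweeps. This is a generic numerical example, not the Tas CFD solver.

```python
# 2D Laplace solve on a uniform Cartesian grid: set boundary values,
# then repeatedly average each interior point over its four neighbours.

N = 21                        # grid points per side, uniform spacing
T = [[0.0] * N for _ in range(N)]
for j in range(N):            # boundary values: one hot edge, three cold
    T[0][j] = 100.0

for _ in range(2000):         # Jacobi sweeps over the interior points
    new = [row[:] for row in T]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            new[i][j] = 0.25 * (T[i - 1][j] + T[i + 1][j]
                                + T[i][j - 1] + T[i][j + 1])
    T = new

centre = T[N // 2][N // 2]    # converges to 25.0 by 4-fold symmetry
print(f"temperature at the grid centre: {centre:.2f}")
```

Nothing problem-specific was needed beyond the boundary values themselves, which is the point being made above about the uniform-grid approach.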

In short, unlike many other CFD products, nothing more was required than to draw the geometry, enter boundary values and simulate: