
Data Unit Calculation | OneStream Community


The items below detail the specific tasks executed during each data unit's calculation process. As an example, the following steps are executed for a single data unit when a user selects Calculate for a single entity, scenario, and time period. A data unit is used to load, clear, calculate, store, and lock data in the multi-dimensional engine. With workflow channels, OneStream provides the following data units.
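The lifecycle above can be sketched as follows. This is a minimal illustration in Python, not OneStream's actual engine or API: the dimension names match the data unit definition in this article, but the class, function, and member names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical model: a data unit is identified by a fixed combination of
# dimensions (Cube, Parent, Entity, Consolidation, Scenario, Time).
@dataclass(frozen=True)
class DataUnitKey:
    cube: str
    parent: str
    entity: str
    consolidation: str
    scenario: str
    time: str

def calculate_data_unit(key: DataUnitKey) -> list:
    """Illustrative ordering of the per-data-unit steps named above:
    load, clear, calculate, store, lock."""
    steps = ["load", "clear", "calculate", "store", "lock"]
    return [f"{step}: {key.cube}/{key.entity}/{key.scenario}/{key.time}"
            for step in steps]

# Example: the unit of work for one entity/scenario/period (sample member names).
log = calculate_data_unit(
    DataUnitKey("GolfStream", "US", "Houston", "Local", "Actual", "2024M3"))
```

The key point the sketch encodes is that the whole tuple, not any single member, identifies the unit of work the engine loads, calculates, and locks.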

Data Unit Calculation Sequence (DUCS) | OneStream Community

Remember, the data unit is defined by the combination of the following dimensions: Cube, Parent, Entity, Scenario, Consolidation, and Time. Your implementation partner should be able to help you with this. Within the data unit calculation sequence, always use the member ID instead of the member name, for example:

Dim acctMember As Member = api.Members.GetMember(DimType.Account.Id, "A1000")

As you can gather from this section, data unit design is important. Once any single data unit reaches 1 million intersections of data, further growth has an increasing impact on performance, and the largest data units should never exceed 2 to 3 million intersections. With proper analysis and design, OneStream's workflow calculation definitions ensure that data is calculated in a specific sequence to accommodate any dependencies, that confirmation rules are applied at the desired entity level, and that up-to-date data is available for affected entities up the reporting tree.
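A rough way to sanity-check data unit size against the 1 million / 2-3 million guidance above is to multiply the populated member counts of the dimensions inside the data unit. The Python sketch below is a back-of-envelope estimator only; the approximation (intersections as a product of member counts) and all function names are assumptions for illustration, not an OneStream utility.

```python
# Back-of-envelope estimate (assumption): the intersection count of one data
# unit is approximated as the product of populated member counts per dimension
# that varies inside the unit (e.g. Account, Flow, UDs).
def estimate_intersections(accounts: int, flows: int, ud_members: int) -> int:
    return accounts * flows * ud_members

# Thresholds from the guidance above: impact grows from ~1M intersections,
# and no data unit should exceed 2-3M.
def size_verdict(intersections: int) -> str:
    if intersections > 3_000_000:
        return "redesign"   # above the 2-3 million ceiling
    if intersections >= 1_000_000:
        return "warning"    # expect increasing performance impact
    return "ok"

# Example: 500 accounts x 10 flow members x 50 UD combinations = 250,000.
verdict = size_verdict(estimate_intersections(500, 10, 50))
```

If the estimate lands in the warning band, the usual design response is to split the work across more entities or scenarios so each data unit stays small.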


UD2 does not define the data unit, which means there is no way for OneStream to evaluate only the data with a value of "Working"; instead, it will process all of the data in the data unit even when only the "Working" data is relevant. OneStream executes formulas at a specific unit of work called a data unit, and this section details the order and combination of logical processes that execute for a data unit. I don't think the entity member filter is supposed to honor the order of the entities; I didn't test it, but the data management job might even be running the calculations in parallel on different processors. Warnings aside, understanding how OneStream stores its data (even at a surface level like this) definitely gave me a better mental model of what was going on in the background whenever I imported data, saved it through a form, or calculated it through a business rule.
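The UD2 point can be made concrete with a small sketch. The assumption below, matching the paragraph above, is that non-data-unit dimensions such as UD2 cannot narrow the unit of work: the engine materializes the whole data unit first and any UD2 filter only applies afterward. The row layout and function name are hypothetical, for illustration only.

```python
# Hypothetical rows inside one data unit. UD2 is not part of the data unit
# key, so it cannot restrict what the engine reads.
rows = [
    {"entity": "Houston", "account": "A1000", "ud2": "Working", "amount": 100.0},
    {"entity": "Houston", "account": "A1000", "ud2": "Final",   "amount": 90.0},
    {"entity": "Houston", "account": "A2000", "ud2": "Working", "amount": 25.0},
]

def process_data_unit(rows, target_ud2):
    # The entire data unit is loaded first...
    loaded = list(rows)
    # ...and only then is the UD2 filter applied, so every row was still
    # read even though only the "Working" slice was relevant.
    relevant = [r for r in loaded if r["ud2"] == target_ud2]
    return len(loaded), relevant

loaded_count, working = process_data_unit(rows, "Working")
```

This is why a calculation scoped to "Working" data still pays the cost of the full data unit: the filter reduces what the formula touches, not what the engine loads.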

