In the previous posts we learnt about BODS Architecture, Datastores in BODS, Creating a SAP BODS Repository, and more. In this article we will learn about BODS Job Components and their use in a simple BODS job.
By job components I mean the objects which are used to build a BODS job. Based on their usability, these objects can be divided into two categories:
- Reusable Objects, and
- Single Use Objects
As the name suggests, Reusable Objects can be used multiple times. Such an object has a single definition, and every call to the object refers back to that definition, so any change in the definition affects all of its calls. Workflows, Dataflows, and Functions are examples of reusable objects.
Single Use Objects can be used only once and have to be developed specifically for each job or dataflow. Examples of single use objects are Scripts, Conditionals, and Try-Catch blocks.
Before jumping into the details of each object, let us look at a typical hierarchy of a job in terms of the objects used in it:
- Project
- Job
- Workflow (optionally containing Scripts and Conditionals)
- Dataflow (source tables, transforms, and target tables)
Let us now look in detail at the objects that are commonly used while creating BODS jobs.
Projects
In very basic terms, a project is nothing but a folder which holds all kinds of jobs in one place. A developer working on several projects may want to keep their jobs separated; to do that, he can create different projects and move the respective jobs into them.
Jobs
Jobs are the smallest executable units. They can be further divided into two types:
- Batch Jobs, and
- Real-time Jobs
Batch jobs are scheduled jobs that trigger at specific times to extract, transform, and load the data. Real-time jobs, on the other hand, depend on an Access Server: the job waits for a message, and as soon as any data is posted in the source system, the Access Server receives a message and uses that information to execute the real-time job.
Scripts
BODS provides a list of transforms which can be used to manipulate the data as required. In addition to these transforms we also have scripts, in which we can write custom code to transform the data or perform other intermediate tasks. Generally, scripts are used to assign values to variables and to execute SQL queries.
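As a rough illustration, a script step might look like the following sketch in the BODS scripting language, assuming a global variable $GV_LOAD_DATE declared at the job level and a datastore named DS_SOURCE containing a table SALES_ORDERS (all three names are hypothetical):

```
# Assign the current system date to a global variable
$GV_LOAD_DATE = sysdate();

# Run a SQL query against the source datastore and store the result
$GV_ORDER_COUNT = sql('DS_SOURCE', 'SELECT COUNT(*) FROM SALES_ORDERS');

# Write both values to the job trace log
print('Load date: ' || $GV_LOAD_DATE);
print('Order count: ' || $GV_ORDER_COUNT);
```

Here sysdate(), sql(), and print() are built-in BODS functions, and || is the string concatenation operator.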
Workflows and Conditionals
Workflows determine the order of execution of dataflows: they group similar processes together, and conditionals then control which parts of the job execute, so that the complete job runs efficiently.
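The condition entered in a Conditional object is a simple Boolean expression. A sketch, assuming a global variable $GV_LOAD_TYPE that was set in an earlier script (the variable and workflow names are hypothetical):

```
# Condition of the Conditional object
$GV_LOAD_TYPE = 'FULL'

# When the expression evaluates to true, the workflow placed in the
# Then pane executes (e.g. WF_Full_Load); otherwise the workflow in
# the Else pane executes (e.g. WF_Delta_Load).
```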
Dataflows
A dataflow can be referred to as the brain of a job: the actual data extraction, transformation, and loading happens here. A typical dataflow will have source tables, transforms, and target tables.
Transforms
Transforms are the objects involved in transforming the data. There are various predefined transforms, and any other specific requirement can be fulfilled via a Script. Depending on their behavior, transforms can be divided into three types:
- Data Integrator Transforms
- Data Quality Transforms
- Data Platform Transforms
Datastores
Datastores are the objects which provide the connection between source and target systems. To learn more about datastores, read the previous article What is Datastore in SAP BODS?
This was all about BODS Job Components. I hope the above explanation was simple and clear. From the next article onwards we will start creating BODS Jobs.