...

Running the analytical workload for the INFLATION application loads the data directly from your local machine into the BDL fact database.

 

...

Step 5: Creating Data Table in Hive

Go to Home->BAPCode->utilityscripts->master->R and use the create_hive_ddl_using_spark_df.R script to generate the Hive SQL statement that creates the data table in Hive; add your module name and entity name to the script before running it. Once the script runs, a .hql file will be created in Home. Open this file, copy the generated statement, and run it in the Hive terminal.
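As a rough illustration, the generated .hql file typically contains a single CREATE TABLE statement built from your module name, entity name, and the Spark data frame's schema. The database, table, columns, and location below are hypothetical placeholders, not the script's actual output:

```sql
-- Hypothetical sketch of a generated Hive DDL statement.
-- The real table name, columns, types, and location come from the
-- create_hive_ddl_using_spark_df.R script's output for your entity.
CREATE EXTERNAL TABLE IF NOT EXISTS inflation.inflation (
  country        STRING,
  year           INT,
  inflation_rate DOUBLE
)
STORED AS PARQUET
LOCATION '/BDL/Fact/INFLATION/inflation';
```

Running this statement in the Hive terminal registers the table so it can later be added as a dataset in Superset.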

 

Attached file: INFLATION_inflation.hql

...

Step 6: Importing Database and Dataset into Superset and creating a dashboard for the charts

...

Now that the business use and classification of the application are established, the application can be created through the UI. In AdminUI, set up the application by going to the Set-up Application tab, selecting Create New, and filling out the file structure. Since we have just a one-layer file system, we will have only one entity in it.

...

Step 2: Creating Meta Model

...

Next, create the ingest model in AdminUI. The first part is defining which processors to use from SDL to FDL. These are the SDL-FDL processors we select for our Teams data. You can refer to this page to understand what each of the processors does.

...

Step 4: Running Workloads

...

Once you confirm in HDFS that the SDL-FDL workload ran correctly, run the FDL-BDL workload next. This will apply the processors we selected in the Ingest Model for the FDL-BDL stage. You can confirm the workload ran correctly by checking the BDL/Fact directory in HDFS.

...

Step 5: Creating Analytical Model

...

Running the analytical workload for the INFLATION application loads the data directly from your local machine into the BDL fact database.

 

...

Step 6: Creating Data Table in Hive

...

Go to Home->BAPCode->utilityscripts->master->R and use the create_hive_ddl_using_spark_df.R script to generate the Hive SQL statement that creates the data table in Hive; add your module name and entity name to the script before running it. Once the script runs, a .hql file will be created in Home. Open this file, copy the generated statement, and run it in the Hive terminal.

 

Attached file: WEALTHINEQUALITIESRACE_wealthinequalities_race.hql

...

Step 7: Importing Database and Dataset into Superset and creating a dashboard for the charts

Once you are in Superset, select the Datasets option from the Data dropdown in the top menu, then select the Add Dataset option. Set the Database to Apache Hive, select your database from Schema, and select the table you would like to add. Superset only allows adding one table at a time, but you can add as many tables as you want, one by one.
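After adding a table as a dataset, a quick sanity check is to query it from Superset's SQL Lab (or the Hive terminal) to confirm the data is visible. The database and table names below are hypothetical placeholders; substitute your own:

```sql
-- Hypothetical sanity check; replace the database and table names
-- with the ones you registered in Hive and added to Superset.
SELECT *
FROM wealthinequalities.wealthinequalities_race
LIMIT 10;
```

If this returns rows, the dataset is ready to be used as the source for charts on your dashboard.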

...