Welcome to AssignmentCache!

Search results for 'CIS355A FINAL PROJECT GUI APPLICATION'

Items 1 to 10 of 429 total

  1. DBM/502 Week 3 SQL statements and screenshot

    DBM 502 Week 3 Data Population, Manipulation, and Retrieval

    $15.00

    Individual Assignment: Database Creation and Reporting

    Resources: the entity relationship diagram and normalized table specifications from the Week Two individual assignment, SQL Server® and Reporting Services on the Toolwire® site, and Beginning Microsoft® SQL Server® 2008 Programming.

    Based on the table design metadata from your Week Two individual assignment, write your own SQL statements in SQL Server Management Studio to create, populate, and query a relational database for Huffman Trucking's fleet truck maintenance application.

    (3 points) Use CREATE TABLE statements to create a table for each entity in Huffman Trucking's fleet truck maintenance database.

    (2 points) Use INSERT statements to populate the tables with realistic sample data. Include at least 2 rows for each table.

    (2.5 points; 0.5 point each) Use a separate SELECT statement to create each of the following queries:
    • A simple query for each table that returns all of the columns and all of the rows.
    • A query that displays each part purchased by Huffman Trucking. For each part, also retrieve its parts catalog information from the parts catalog table and vendor information from the vendor table.
    • A query that displays all of the rows in the vehicle maintenance table. For each vehicle maintenance row, join the corresponding information from the maintenance descriptions table and the vehicles table.
    • A query that displays each row in the maintenance work order table. For each row, join the corresponding information from the maintenance description table.
    • A query that counts the number of maintenance work orders for each vehicle in the maintenance work order table. Display the vehicle column and the corresponding count of work orders for each vehicle.

    (2.5 points) Create a report using SQL Server Business Intelligence Reporting Services of the parts purchasing history for Huffman Trucking. The report should display all parts purchasing history, including manufacturer and vendor information, parts catalog information, associated parts inventory issues, and all parts inventory purchases. Format the report to present to senior management at Huffman Trucking.

    Place screen snapshots of all of the above SQL statements as they executed in SQL Server Management Studio, showing the SQL statements and the results returned, as well as a screen snapshot of your Reporting Services report showing your actual data, into a single Word or .pdf file.
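    As a rough illustration of what the graded statements look like, here is a minimal sketch for one hypothetical table; the table and column names are placeholders, not the actual Huffman Trucking schema from your Week Two design:

    -- hypothetical VEHICLES entity; your own entities come from the Week Two design
    CREATE TABLE VEHICLES (
        VEHICLE_ID  INT          PRIMARY KEY,
        VIN         VARCHAR(17)  NOT NULL,
        MAKE        VARCHAR(30),
        MODEL       VARCHAR(30)
    );

    -- at least two realistic sample rows per table
    INSERT INTO VEHICLES (VEHICLE_ID, VIN, MAKE, MODEL) VALUES (1, '1FUJA6CK34LM12345', 'Freightliner', 'Columbia');
    INSERT INTO VEHICLES (VEHICLE_ID, VIN, MAKE, MODEL) VALUES (2, '1XKWDB0X57J123456', 'Kenworth', 'T800');

    -- simple query returning all columns and all rows
    SELECT * FROM VEHICLES;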
  2. DBM 502 Week 4 Data Warehouses BI Presentation

    DBM 502 Week 4 Data Warehouses BI Presentation

    $15.00

    Individual Assignment: BI Presentation

    Resources: Huffman Trucking in the Virtual Organizations; the entity relationship diagram, tables, and sample database from the Week Two and Week Three individual assignments.

    Prepare a Microsoft® PowerPoint® presentation for senior management at Huffman Trucking explaining how integrating the fleet truck maintenance information into the corporate data warehouse will support organizational goals. Explain how business intelligence may improve efficiencies and fleet performance. Provide a detailed example of a data-mining technique or application and how it will provide information useful to management.
    · Describe a data-mining technique from Table 9-4 of Modern Database Management or a data-mining application from Table 9-5 of Modern Database Management.
    · Identify a specific use of this data-mining tool with the information from the sample database from the Week Four individual assignment.
    · Identify the attributes that this data-mining tool will access.
    · Explain how the information gathered using this data-mining tool will support Huffman Trucking's organizational goals.

    Support your arguments with at least four peer-reviewed or industry publications. Present your recommendation in a 6- to 8-slide Microsoft® PowerPoint® presentation. Keep your slides uncluttered, with at most 5 bullets per slide and 7 words per bullet. You must include speaker notes in your presentation.
  3. DBM 502 Week 5 Mountain View Community Hospital Case Study

    DBM/502 Week 5 Data and Database Administration

    $20.00

    Individual Assignment: Database Paper

    Choose one of the following assignment options to complete:

    Option 1: Data Governance, Quality, Integration, and Security in your organization. Review data management and information security practices where you work or at an organization with which you are familiar. Recommend three specific policy changes to improve data governance, quality, integration, and security. Support each recommendation with evidence from at least two scholarly or trade publications.

    Option 2: Mountain View Community Hospital Case Study. Review the Case sections of Ch. 10 and Ch. 11 in Modern Database Management. Analyze the data governance, quality, integration, and security at Mountain View Community Hospital. Address the strengths and opportunities for improvement in data management and security. Support your arguments with evidence from scholarly or trade publications.

    For either option chosen, write a 1,050- to 1,400-word paper consistent with APA guidelines.
  4. DBM 380 Week 2 Database Environment Paper

    DBM/380 Week 2 Database Design

    $15.00

    Individual Database Environment Paper

    Write a 750- to 1,050-word paper in which you complete the following:
    · Choose a database environment from the following:
      o An appropriate database environment within your workplace (must be approved by your instructor)
      o An art museum that needs to track the artwork, artists, and locations where the art is displayed or stored within the museum
      o Smith Consulting (Virtual Organization) – needs a database to track their consulting staff, each staff member's skill sets, and what projects they are working on
    · Analyze the database environment.
    · Describe the problems and constraints.
    · Describe the objectives of the database environment.
    · Describe the scope and boundaries.
    · List the data specifications (must include a minimum of three entities with attributes).

    Include 3 to 5 references. Format your paper consistent with APA guidelines.

    Note: The database environment chosen will be used in the Weeks Three and Four Individual Assignments.
  5. DBM 380 Week 3 Art Museum Access Database

    DBM 380 Week 3 Entity Relationship Diagram

    $15.00

    Individual ERD Creation Project

    The following assignment is based on the database environment chosen and discussed in the Week Two Individual Assignment.
    · Use a Microsoft® Visio® diagram to create a detailed ERD using the data specifications noted in the Week Two Individual Assignment. Make any necessary changes based on your faculty's feedback.
    · Use a Microsoft® Access® database to create the preliminary database tables, columns with data types, primary keys, and relationships.
  6. DBM 380 Week 4 Art Museum Normalized Access Database

    DBM 380 Week 4 Normalization of ERD

    $15.00

    Individual Normalization of the ERD

    The following assignment is based on the database environment chosen and created in the Week Three Individual Assignment. Your database project must meet the following assessment requirements: Design and develop a database using professional principles and standards.
    · Provide a logical and physical design of the database.
    · Use a relational database software application to develop the database.
    · Provide an entity relationship diagram.
    · Normalize the database.
    · Generate and provide test data.

    Use a Microsoft® Visio® diagram to normalize the ERD to third normal form (3NF). Use the Microsoft® Access® database created in Week Three to create a minimum of 10 rows of test data in each table. Also, create at least one query that joins two tables and returns values from both tables.

    Note: Only the Microsoft® Visio® diagram must be normalized to 3NF. The 3NF is not required for the Microsoft® Access® database.

    Submit the ERD and final database to the appropriate Assignment link.
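    For the joining query, a minimal sketch in Access SQL, assuming the art museum environment with hypothetical ARTIST and ARTWORK tables (your own table and column names will come from your Week Three database):

    SELECT ARTWORK.Title, ARTIST.ArtistName
    FROM ARTIST INNER JOIN ARTWORK ON ARTIST.ArtistID = ARTWORK.ArtistID;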
  7. DBM 449 Lab 3 SQL File

    DBM 449 Lab 3 Distributed Database

    $20.00

    LAB OVERVIEW

    Scenario/Summary
    To the end user, working with databases distributed throughout a company's network is no different than working with multiple tables within a single database. The fact that the different databases exist in other locations should be totally transparent to the user. For this lab we are going to take on the role of a database administrator in a company that has three regional offices in the country. You work in the central regional office, but there is also a West Coast Region located in Seattle and an East Coast Region located in Miami. Your role is to gather report information from the other two regions.

    For this lab you are going to work with three different databases. You already have your own database instance. You will also be working with a database named SEATTLE representing the West Coast Region and a database named MIAMI representing the East Coast Region. Login information for these two additional database instances is as follows:

    SEATTLE: Userid - seattle_user
    Password - seattle
    Host String - seattle

    MIAMI: Userid - miami_user
    Password - miami
    Host String - miami

    To record your work for this lab use the LAB3_Report.doc found in Doc Sharing. As in your previous labs, you will need to copy/paste your SQL statements and results from SQL*Plus into this document. This will be the main document you submit to the Dropbox for Week 3.

    LAB STEPS
    STEP 1: Setting up Your Environment

    1. Be sure you are connected to the DBM449_USER schema that was created in lab 1. 
    2. To begin this lab you will need to download the LAB3_DEPTS.SQL script file associated with the link and run the script in your DBM449_USER schema of your database instance. This script contains a single table that you will be using to help pull data from each of the other two database instances. Notice that the DEPTNO column in this table is the PRIMARY KEY column and can be used to reference or link to the DEPTNO column in the other two database employee tables.
    3. Now you need to create a couple of private database links that will allow you to connect to your other two regional databases. To accomplish this, use the connection information listed above in the Lab Overview section. Name your links using your database instance name together with the region name as the name for the link. Separate the two with an underscore (example - DB1000_SEATTLE). A sketch of these statements follows the sample output below.
    4. After creating both of your database links, query the USER_DB_LINKS view in the data dictionary to retrieve information about your database links.  The output from your query should look similar to what you see below.  You will need to set your linesize to 132 and format the DB_LINK and HOST columns to be only 25 bytes wide to get the same format that you see.

    DB_LINK                   USERNAME                       HOST                      CREATED
    ------------------------- ------------------------------ ------------------------- ---------
    DB1000_MIAMI              MIAMI_USER                     miami                     09-DEC-08
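
    A minimal sketch of the link creation and the dictionary query; the DB1000 prefix is only an example instance name, so substitute your own:

    CREATE DATABASE LINK DB1000_SEATTLE
      CONNECT TO seattle_user IDENTIFIED BY seattle USING 'seattle';

    CREATE DATABASE LINK DB1000_MIAMI
      CONNECT TO miami_user IDENTIFIED BY miami USING 'miami';

    SET LINESIZE 132
    COLUMN DB_LINK FORMAT A25
    COLUMN HOST FORMAT A25
    SELECT DB_LINK, USERNAME, HOST, CREATED FROM USER_DB_LINKS;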

    STEP 2: Testing your Database Links
    Each of your remote databases has an employee data table. The tables are named SEATTLE_EMP and MIAMI_EMP respective to the database they are in. Using the appropriate database link, query each of the two tables to retrieve the employee number, name, job function, and salary. (HINT: you can issue a DESC command on each of the distributed tables to find out the actual column names just like you would for a table in your own instance.)
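
    One possible form of these queries, assuming the remote tables use the classic EMPNO, ENAME, JOB, and SAL column names (confirm with DESC first) and the example link names from Step 1:

    DESC SEATTLE_EMP@DB1000_SEATTLE

    SELECT EMPNO, ENAME, JOB, SAL FROM SEATTLE_EMP@DB1000_SEATTLE;
    SELECT EMPNO, ENAME, JOB, SAL FROM MIAMI_EMP@DB1000_MIAMI;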

    STEP 3: Connecting Data in the Seattle Database
    Write a query that will retrieve all employees from the Seattle region who are salespeople working in the marketing department. Show the employee number, name, job function, salary, and department name (HINT: The department name is in the DEPT table) in the result set.
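
    A sketch under the same column-name assumptions; the DNAME column and the literal job and department values are assumptions to be checked against the actual data:

    SELECT e.EMPNO, e.ENAME, e.JOB, e.SAL, d.DNAME
    FROM   SEATTLE_EMP@DB1000_SEATTLE e
           JOIN DEPT d ON e.DEPTNO = d.DEPTNO
    WHERE  e.JOB = 'SALESMAN'              -- assumed job value
      AND  UPPER(d.DNAME) = 'MARKETING';   -- assumed department name value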

    STEP 4: Connecting Data in the Miami Database
    Write a query that will retrieve all employees from the Miami region who work in the accounting department. Show the employee number, name, job function, salary, and department name (HINT: The department name is in the DEPT table) in the result set.

    STEP 5: Connecting Data in all Three Databases
    Now we need to expand our report. Write a query that will retrieve employees from both the Seattle and Miami regions who work in sales. Show the employee number, employee name, job function, salary and location name in the result set (HINT: The location name is in the DEPT table).
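
    A sketch of the combined report, assuming a LOC column in DEPT and the same placeholder link and column names used above:

    SELECT e.EMPNO, e.ENAME, e.JOB, e.SAL, d.LOC
    FROM   SEATTLE_EMP@DB1000_SEATTLE e JOIN DEPT d ON e.DEPTNO = d.DEPTNO
    WHERE  e.JOB = 'SALESMAN'
    UNION ALL
    SELECT e.EMPNO, e.ENAME, e.JOB, e.SAL, d.LOC
    FROM   MIAMI_EMP@DB1000_MIAMI e JOIN DEPT d ON e.DEPTNO = d.DEPTNO
    WHERE  e.JOB = 'SALESMAN';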

    STEP 6: Improving Data Retrieval from all Three Databases
    Writing queries like the ones above can be fairly cumbersome. It would be much better to be able to pull this type of data as though it was coming from a single table, and in fact this can be done by creating a view.

    1. Using the query written above as a guide, write and execute the SQL statement that will create a view that will show all employees in both the Seattle and Miami regions (you can use your own naming convention for the view name). Show the employee number, name, job, salary, commission, department number, and location name for each employee (HINT: The location name is in the DEPT table). A sketch of such a view follows this list.
    2. Now write a query that will retrieve all the data from the view just created.
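
    A sketch of such a view; the view name, link names, and column names are the same assumptions used above:

    CREATE OR REPLACE VIEW ALL_REGION_EMP AS
      SELECT e.EMPNO, e.ENAME, e.JOB, e.SAL, e.COMM, e.DEPTNO, d.LOC
      FROM   SEATTLE_EMP@DB1000_SEATTLE e JOIN DEPT d ON e.DEPTNO = d.DEPTNO
      UNION ALL
      SELECT e.EMPNO, e.ENAME, e.JOB, e.SAL, e.COMM, e.DEPTNO, d.LOC
      FROM   MIAMI_EMP@DB1000_MIAMI e JOIN DEPT d ON e.DEPTNO = d.DEPTNO;

    SELECT * FROM ALL_REGION_EMP;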

    Deliverables
    Submit your completed Lab 3 Report to the Dropbox. Your report should contain copies of each query and result set outlined in the lab along with the requested explanation of whether or not it satisfied the business requirement outlined for that particular section of the lab.

  8. DBM 449 Lab 5 SQL Audit and Profile Management

    DBM 449 Lab 5 Audit and Profile Management

    $20.00

    In your lab for this week you are going to work with three different areas and processes within the Oracle Database that can be used to control data security. Each of these three processes has its own distinctive application to providing levels of security. In each case the individual processes deal with either limiting a user's access to the database, limiting access to processes within the database, or keeping track of what the user is doing while in the database.

    For the lab you will be using the SCOTT user, which is already created in your instance. In Step 4 you will also be asked to shut down your instance, make some edits to the init.ora file for your instance, and then restart the instance. If you are not comfortable with this process, which was first introduced to you in DBM438, then refer to the iLab Manual found in week 1 for guidance.

    To record your work for this lab use the LAB5_Report.doc found in Doc Sharing. As in your previous labs you will need to copy/paste your SQL statements and results from SQL*Plus into this document. This will be the main document you submit to the Dropbox for Week 5.

    LAB STEPS

    STEP 1: Define a New Profile

    Oracle provides the ability to set expirations, limit the reuse, and define the complexity of passwords. In addition, accounts can be locked if the password is entered incorrectly too many times. In this section of the lab we are going to create a custom profile that will then be applied to the SCOTT user.

    1. To begin, log into your instance as the SYS user.
    2. Write an SQL script that will create a new profile named DBM449_SCOTT_PROFILE that will do the following (a sketch follows this list):
      • Limit the number of failed login attempts to 3 in a row.
      • Limit the overall connection time to 10 hours (we will give him a little leeway in case he has to work overtime).
      • Allow a session to be idle no more than 1 hour.
      • Change the password every 60 days.
      • Allow the user 3 days to change the password after it expires.
      • Not allow a previous password to be reused before there have been three password changes.
    3. Execute your profile script and verify that the profile has been created by running a query against the DBA_PROFILES view in the data dictionary. Limit your output to ONLY the DBM449_SCOTT_PROFILE parameters.
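
    A sketch of one profile that meets the limits above; CONNECT_TIME and IDLE_TIME are expressed in minutes, and PASSWORD_REUSE_MAX 3 is one reasonable reading of the reuse requirement:

    CREATE PROFILE DBM449_SCOTT_PROFILE LIMIT
      FAILED_LOGIN_ATTEMPTS 3
      CONNECT_TIME          600    -- 10 hours, in minutes
      IDLE_TIME             60     -- 1 hour, in minutes
      PASSWORD_LIFE_TIME    60     -- days
      PASSWORD_GRACE_TIME   3      -- days
      PASSWORD_REUSE_MAX    3;

    SELECT PROFILE, RESOURCE_NAME, LIMIT
    FROM   DBA_PROFILES
    WHERE  PROFILE = 'DBM449_SCOTT_PROFILE';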

    Be sure to copy/paste your script and result set output to the appropriate section in the Lab5_report document.

    STEP 2: Testing the New Profile

    Now that we have a new profile for the SCOTT user we need to verify that it works properly. For obvious reasons there are going to be parts of the profile that we cannot test within the confines of this lab due to time constraints, but we can test to verify that the SCOTT user is being controlled by the profile.

    1. The first thing we need to do is assign the profile to the SCOTT user. While still logged into your instance as the SYS user, write and execute the SQL command that will assign the new DBM449_SCOTT_PROFILE profile to the SCOTT user (a sketch of the administrative commands used in this step follows this list).
    2. Now log into SCOTT (password is TIGER). Remember that you must supply the database instance name when logging in from the SQL> prompt just as you do when using the login window, i.e. CONN SCOTT/TIGER@DB####.WORLD.
    3. There are several things that we can test related to the logging in and changing a password so here we go.
      • You should now be successfully connected to the SCOTT user. Write the connect command again, but this time use an incorrect password. NOTE: you should get a warning message stating that you are no longer connected to Oracle. That is fine, just keep trying to log in.
      • Repeat the above process until you get the ORA-28000: the account is locked error which will indicate that the profile is working here.
      • At this point we need to get the account unlocked so you will need to login to your instance as the SYS user and unlock the SCOTT account BUT DO NOT LOG BACK INTO THE SCOTT USER YET.
      • Now we can test the password reuse parameter. To do this we must EXPIRE the current password. Write and execute the SQL command to expire the password for the SCOTT user.
      • Now log back into the SCOTT user. You should receive a message stating that the password has expired (ORA-28001: the password has expired) and then prompting you to change the password.
      • Try to reuse the TIGER password. You should receive the following - ORA-28007: the password cannot be reused.
    4. Now log into the SCOTT user again and this time change the password to LION to complete this step of the lab.
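
    Sketches of the administrative commands used in this step, all issued as the SYS user:

    ALTER USER SCOTT PROFILE DBM449_SCOTT_PROFILE;   -- assign the new profile
    ALTER USER SCOTT ACCOUNT UNLOCK;                 -- after the failed-login test locks the account
    ALTER USER SCOTT PASSWORD EXPIRE;                -- force a password change on next login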

    Be sure to copy/paste your script and result set output to the appropriate section in the Lab5_report document.

    STEP 3: Using the PRODUCT_USER_PROFILE table

    As the owner of a schema, a user has certain inherited privileges that would allow the user to pass access to his/her own objects on to other users. Oftentimes this can open up data to scrutiny by individuals who probably do not need to have access to it. These types of decisions should always be made by the DBA in charge of the database. One mechanism the DBA has for keeping users from using these inherited privileges is excluding those commands using the PRODUCT_USER_PROFILE (PUP) table. In this section of the lab we are going to do this to the SCOTT user by setting up a scenario that will prohibit him from giving the user GEORGE (created in lab 2) access to the EMP table.

    1. For this section and the remainder of the lab you must have the PRODUCT_USER_PROFILE successfully loaded and accessible in your instance. The creation of this profile was one of the first things done back in Lab 1 when you ran the PUPBLD.SQL script. If you are getting an error message stating "Error accessing PRODUCT_USER_PROFILE" when you log in as the DBM449_USER or the SCOTT user, then this profile is not successfully installed. Work with your instructor to figure out why your script from Lab 1 did not work correctly. Until this is resolved you will not be able to complete the remainder of the lab.
    2. If you have the PRODUCT_USER_PROFILE successfully working then log in to your database instance as the SYS user.
    3. Now we need to limit SCOTT from being able to use the GRANT command.
      • Insert the proper values into the PRODUCT_USER_PROFILE table that will keep the SCOTT user from using the GRANT command (a sketch of this insert follows this list). Remember that some of the values in your insert statement must be in upper case and some will need to be in mixed case. Once you have done this, query the table to verify the insert (REMEMBER: you cannot query the table as the SYS user, only as the SYSTEM user).
      • Now we need to test our above settings and make sure they are working.
      • Connect to the SCOTT user (remember that you changed the password to LION).
      • Write and execute the statement that would GRANT the user GEORGE the ability to write a select statement and see the data in the EMP table owned by SCOTT. You should receive the following message - SP2-0544: Command "grant" disabled in Product User Profile.
    4. This verifies that you have now disabled the ability of the SCOTT user to allow another user to access any of the data in his schema.
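
    A commonly used form of the insert, run as SYSTEM; treat the column list as an assumption and confirm it with DESC PRODUCT_USER_PROFILE before running:

    INSERT INTO PRODUCT_USER_PROFILE (PRODUCT, USERID, ATTRIBUTE, CHAR_VALUE)
    VALUES ('SQL*Plus', 'SCOTT', 'GRANT', 'DISABLED');
    COMMIT;

    SELECT PRODUCT, USERID, ATTRIBUTE, CHAR_VALUE FROM PRODUCT_USER_PROFILE;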

    Be sure to copy/paste your script and result set output to the appropriate section in the Lab5_report document.

    STEP 4: Setting up the Database to use Auditing

    Being able to audit what, when and where people are doing things in the database can be a very enlightening thing for a DBA. It can also be a very important tool in working with Data Security. Oracle provides the ability to do various types of auditing, but it takes some special setting up of the environment for this to work. In this step we are going to make the necessary adjustments to the current Oracle instance so that we can enable auditing and make some tests. If you need to review the processes to be used here then refer to the iLab Manual in week 1.

    1. First you need to make sure that you are logged into your instance as the SYS user.
    2. At this point issue a SHUTDOWN IMMEDIATE command to shut down your database instance.
    3. Once the instance is shut down you need to go into your Citrix Windows Explorer application, find your database instance set of directory folders, drill down to the pfile directory folder and open your init.ora file found in that folder.
    4. Under the section titled "Security and Auditing" you need to add the parameter AUDIT_TRAIL and set the parameter to DB_EXTENDED. This will allow the SQL_TEXT column of the DBA_AUDIT_OBJECT view to be populated. Save and close the file and then go back to your SQL*Plus session.
    5. Now, using the init.ora file, start your instance back up to an OPEN status. You can do this by issuing a STARTUP PFILE= statement and pointing to your init.ora file (a sketch of this sequence follows this list).
    6. Once you have completed this process you are ready to begin setting up the database to audit some activity.
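
    A sketch of the sequence; the pfile path is only a placeholder, so use the actual path to your instance's init.ora:

    SHUTDOWN IMMEDIATE

    -- in init.ora, under "Security and Auditing":
    --   audit_trail = db_extended

    STARTUP PFILE='...\DB####\pfile\init.ora'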

    Be sure to copy/paste your script and result set output to the appropriate section in the Lab5_report document.

    STEP 5: Creating an Audit Trail

    Oracle permits audit trails to be generated for session login attempts, access to objects, and activity performed on objects. Again using the SCOTT user we are going to set up several scenarios for auditing what SCOTT does while in a session. NOTE: if you need to work through this process several times you can delete the values in the AUD$ base table by issuing the TRUNCATE TABLE AUD$ command while logged in as the SYS user.

    1. Make sure that you are connected as user SYS.
    2. Display the value of the AUDIT_TRAIL parameter. For the VALUE column you should have a value of DB_EXTENDED.
    3. Now we can set up auditing to track what goes on in the database.
      • Write the SQL statement to audit successful and unsuccessful login attempts by SCOTT.
      • Write the SQL statement to audit any successful INSERT, UPDATE, or DELETE performed on the DEPT table in SCOTT's schema (sketches of both audit statements follow this list).
    4. Now we need to test the audits to verify that they work.
      • Log into the SCOTT user (remember that the password is LION) and perform the following:
      • Write and execute an UPDATE statement that will change the value in the LOC column of the DEPT table to MIAMI where the DEPTNO value is 10. Be sure to issue a COMMIT.
      • Write and execute the INSERT statement that will insert the following values into DEPT - (50, 'LEGAL', 'HOUSTON'). Be sure to issue a COMMIT.
      • Write and execute the DELETE statement that will delete the row from the DEPT table that was just inserted in the step above.  Again, be sure to issue a COMMIT.
      • Try to reconnect to the SCOTT user with an invalid password.
      • Now connect back to the SYS user.
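
    Sketches of the audit statements (issued as SYS) and the test DML run afterwards as SCOTT, using the values given above:

    AUDIT SESSION BY SCOTT;
    AUDIT INSERT, UPDATE, DELETE ON SCOTT.DEPT BY ACCESS WHENEVER SUCCESSFUL;

    -- then, connected as SCOTT:
    UPDATE DEPT SET LOC = 'MIAMI' WHERE DEPTNO = 10;
    COMMIT;
    INSERT INTO DEPT VALUES (50, 'LEGAL', 'HOUSTON');
    COMMIT;
    DELETE FROM DEPT WHERE DEPTNO = 50;
    COMMIT;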

    Now we need to see if our auditing worked.

    1. While logged into your instance as the SYS user, query the DBA_AUDIT_OBJECT view of the data dictionary for the user name of the account (not the OS), the object owner, the object name, the action name, and the SQL command (text).
    2. Did you notice that the entries for successful and unsuccessful logon attempts were not there? Now query the user name, action name, and return code values in the DBA_AUDIT_SESSION view. You should find that information here (sketches of both queries follow this list).
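
    Sketches of the two dictionary queries; the column names follow the standard audit views:

    SELECT USERNAME, OWNER, OBJ_NAME, ACTION_NAME, SQL_TEXT
    FROM   DBA_AUDIT_OBJECT;

    SELECT USERNAME, ACTION_NAME, RETURNCODE
    FROM   DBA_AUDIT_SESSION;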

    Be sure to copy/paste your script and result set output to the appropriate section in the Lab5_report document.

    Deliverables

    Submit your completed Lab 5 Report to the Dropbox. Your report should contain copies of each query and result set outlined in the lab along with the requested explanation of whether or not it satisfied the business requirement outlined for that particular section of the lab.

    Learn More
  9. DBM 449 Lab 6 SQL Analytical Extensions and Materialized Views

    DBM 449 Lab 6 SQL Analytical Extensions and Materialized Views

    $20.00

    For the lab this week we are going to look at how the ROLLUP and CUBE extensions available in SQL can be used to create query result sets that have more than one dimension to them. Both of these extensions are used in conjunction with the GROUP BY clause and allow for a much broader look at the data.

    The first thing you will do for this lab is download the lab6_create.sql file and run the file in your database instance. This file will log into the DBM449_USER and then create and populate a set of tables that will be used for this lab.  Instructions for this are outlined in Step 1.

    To record your work for this lab use the LAB6_Report.doc found in Doc Sharing. As in your previous labs you will need to copy/paste your SQL statements and results from SQL*Plus into this document. This will be the main document you submit to the Dropbox for Week 6.

    LAB STEPS

    STEP 1: Setting up Your Instance

    For this lab you will be using a different user and set of tables than you have used so far for other labs. To set up your instance you will need to do the following.

    1. Download the lab6_create.sql file associated with the link to either the C drive on your computer or the F drive in your Citrix account.
    2. Open up the file and edit the login information at the top for the new user that is being created. You will need to replace the @ORACLE piece with the specifics for your instance name. DO NOT include AS SYSDBA after the name of your instance for this login.
    3. Now log into your instance as the SYS user. Run the script. The script is too long to copy/paste it into your SQL*Plus session so you should run the script using the @ sign from the SQL> prompt.
    4. Once the script has finished running, issue a SELECT * FROM TAB; SQL statement. The result set will have tables from other labs as well, but you want to make sure that you see the following tables listed.

    TNAME                          TABTYPE CLUSTERID
    ------------------------------ ------- ----------
    SUPPLIER                       TABLE
    PRODUCT                        TABLE
    DISTRICT                       TABLE
    CUSTOMER                       TABLE
    TIME                           TABLE
    SALES                          TABLE

     

    STEP 2: Using the ROLLUP Extension 

    In this section of the lab you are going to create a sales report that will show a supplier code, product code and the total sales for each product based on unit price times a quantity. More importantly, the column that shows the total sales will also show a grand total for the supplier as well as a grand total overall (this will be the last row of data shown). To do this you will use the ROLLUP extension as part of the GROUP BY clause in the query. Use aliases for the column names so that the output columns in the result set look like the following.

    SUPPLIER CODE PRODUCT    TOTAL SALES
    ------------- ---------- -----------

    For this report you are going to use the SALES, PRODUCT and SUPPLIER tables. You should be able to write your query using NATURAL JOIN but if you feel more comfortable using a traditional JOIN method that will work just as well. When finished you should have a total of 16 rows in your report and the grand total amount should show 2810.74.
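
    A sketch of the kind of query intended; the column names (SUPPLIER_CODE, PRODUCT_CODE, UNIT_PRICE, QUANTITY) are assumptions, so confirm the real names with DESC before running:

    SELECT supplier_code                AS "SUPPLIER CODE",
           product_code                 AS "PRODUCT",
           SUM(unit_price * quantity)   AS "TOTAL SALES"
    FROM   sales NATURAL JOIN product NATURAL JOIN supplier
    GROUP  BY ROLLUP (supplier_code, product_code);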

    Be sure to copy your SQL code and the result set produced and paste it into the appropriate place in the LAB6_REPORT.

    STEP 3: Using the CUBE Extension

    In this section of the lab you are going to create a sales report that will show a month code, product code and the total sales for each product based on unit price times a quantity. In this report the column that shows the total sales will also show a subtotal for each month (in this case representing a quarter). Following the monthly totals for each product and the subtotal by month, the report will list a total for each product sold during the period with a grand total for all sales during the period (this will be the last row of data shown). To do this you will use the CUBE extension as part of the GROUP BY clause in the query. Use aliases for the column names so that the output columns in the result set look like the following.

         MONTH PRODUCT    TOTAL SALES
    ---------- ---------- -----------

    For this report you are going to use the SALES, PRODUCT and TIME tables. You should be able to write your query using NATURAL JOIN but if you feel more comfortable using a traditional JOIN method that will work just as well. When finished you should have a grand total amount of 2810.74 (same total as in the step 2).
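
    A sketch along the same lines, assuming a MONTH column in the TIME table and the same placeholder sales columns as in Step 2:

    SELECT month                        AS "MONTH",
           product_code                 AS "PRODUCT",
           SUM(unit_price * quantity)   AS "TOTAL SALES"
    FROM   sales NATURAL JOIN product NATURAL JOIN time
    GROUP  BY CUBE (month, product_code);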

    Be sure to copy your SQL code and the result set produced and paste it into the appropriate place in the LAB6_REPORT.

    STEP 4: Materialized Views and View Logs

    Materialized views, sometimes referred to as snapshots, are a very important aspect of dealing with data when doing data mining or working with a data warehouse. Unlike regular views, a materialized view does not always automatically react to changes made in the base tables of the view. To help keep track of changes made to the base tables you must create what is called a Materialized View Log on each base table that will be used in the view. In this step of the lab we will do this.

    For the Materialized View we are going to create we are going to use the TIME and the SALES tables. Before we can create the view you will need to create a Materialized View Log on each of these two tables that will keep track of the ROWID and Sequence and include new values that have been added to the base table.
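
    A sketch of the two log statements, matching the ROWID, sequence, and new-values requirements stated above:

    CREATE MATERIALIZED VIEW LOG ON time
      WITH ROWID, SEQUENCE INCLUDING NEW VALUES;

    CREATE MATERIALIZED VIEW LOG ON sales
      WITH ROWID, SEQUENCE INCLUDING NEW VALUES;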

    Be sure to copy your SQL code and the result set produced and paste it into the appropriate place in the LAB6_REPORT.

    STEP 5: Creating and Using the Materialized View

    Now that we have our logs created we can progress on to the view itself. For this part of the lab you are going to create a Materialized View, demonstrate that the view works, insert a row of data into one of the base tables and then update the view. Finally, you will show that the new data is in the view. The following steps will help move you through this process.

    1. First, write the SQL CREATE statement that will create a Materialized View based on the following:
      • Name the view SALESBYMONTH.
      • Include clauses that will build the view immediately, completely refresh the view, and enable a query rewrite.
      • For the columns of the view you want to show the YEAR, MONTH, PRODUCT CODE, a TOTAL SALES UNITS, and a TOTAL SALES.
      • You will want to group the columns by year, month and product code respectively.
    2. Execute your script to create the view and then issue a SELECT * FROM SALESBYMONTH (a sketch of the CREATE statement follows this list).
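
    A sketch of the CREATE statement; the column names (YEAR, MONTH, PRODUCT_CODE, QUANTITY, UNIT_PRICE) are assumptions taken from the report descriptions above, so verify them against the actual tables:

    CREATE MATERIALIZED VIEW SALESBYMONTH
      BUILD IMMEDIATE
      REFRESH COMPLETE
      ENABLE QUERY REWRITE
      AS
      SELECT year, month, product_code,
             SUM(quantity)              AS units_sold,
             SUM(quantity * unit_price) AS sales_total
      FROM   sales NATURAL JOIN time
      GROUP  BY year, month, product_code;

    SELECT * FROM SALESBYMONTH;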

    The output columns from your view should look similar to the following (use aliases to format the column headings) and you should have 18 rows in the result set.


                                      YEAR      MONTH PRODUCT CO UNITS SOLD SALES TOTAL
                                  -------- ---------- ---------- ---------- -----------

    Now we are going to add some data and update the view. Because we have several derived columns in our view, we will have to force the update, as Oracle will not automatically update a view with this configuration.

    1. To begin with, insert the following data into the SALES table - (207, 110016, 'SM-18277',1,8.95).
    2. Now we are going to use a subprogram within the Oracle built-in package DBMS_MVIEW. The REFRESH subprogram within this package will update our view so that we can see the new data.
    3. Write an SQL EXECUTE statement that will use the REFRESH procedure in the DBMS_MVIEW package (HINT: packagename.subprogram). The REFRESH subprogram accepts two parameters: the name of the materialized view to refresh, and either a 'c', 'f', or '?'. For the purposes of the lab use the 'c'. (You can refer back to pages 654-659 of the DBA Handbook readings for week 3.) A sketch of these statements follows this list.
    4. Execute your statement to update the view and then query the view once again.
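
    A sketch of the insert and the refresh call; the SALES column order is assumed to match the value list given above:

    INSERT INTO sales VALUES (207, 110016, 'SM-18277', 1, 8.95);
    COMMIT;

    EXECUTE DBMS_MVIEW.REFRESH('SALESBYMONTH', 'c');

    SELECT * FROM SALESBYMONTH;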

    You should now see that the row for units sold in month 10 for SM-18277 has increased from 3 to 4 and total sales amount has gone from 26.85 to 35.80.

    Be sure to copy your SQL code and the result set produced and paste it into the appropriate place in the LAB6_REPORT.

    Deliverables

    Submit your completed Lab 6 Report to the Dropbox. Your report should contain copies of each query and result set outlined in the lab along with the requested explanation of whether or not it satisfied the business requirement outlined for that particular section of the lab.

    Learn More
  10. CTS2437 Final Exam SQL Server

    CTS2437 Final Exam SQL Server Administration

    $15.00

    CTS2437 – Final Exam

    Provide the SQL statements required to accomplish the following tasks.
    #1 (10 points) Create a database named FINAL_EXAM that you will then use for all remaining problems.
    #2 (20 points) Create the tables and appropriate constraints based on the given ER diagram (Category and Product). Use appropriate data types. Note that the size column should only accept S, M, or L. In addition, the price column should have values greater than zero. All columns in both tables are required.
    #3 (5 points) Insert 3 rows in the Category table. The db is for a small shoe store, so use appropriate data for the description ("Men", "Women", "Children").
    #4 (5 points) Insert 3 Product records for each category in the Product table. Use whatever data you see as appropriate.
    #5 (5 points) Use one statement to increase the price of all products in the Men category by 25%.
    #6 (5 points) Use one statement to delete all products for the Children category.
    #7 (10 points) Create and execute a view named EXAM_VIEW that shows all columns from both tables. Use an inner join.
    #8 (10 points) Create a database trigger named EXAM_TRIGGER that prevents a user from deleting a Product record on Tuesdays. Display an appropriate error message. Make sure to show that the trigger is working properly.
    #9 (10 points) Create a stored procedure named SP_EXAM that will be used to insert records into the Product table. Make sure to show that the procedure is working properly.
    #10 (5 points) Remove the EXAM_VIEW object from the database.
    #11 (5 points) Remove the SP_EXAM stored procedure from the database.
    #12 (5 points) Remove the EXAM_TRIGGER database trigger from the database.
    #13 (5 points) Remove the FINAL_EXAM database.
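
    For item #8, a rough T-SQL sketch of such a trigger; the Product table name is taken from the exam description, and everything else is illustrative:

    CREATE TRIGGER EXAM_TRIGGER
    ON Product
    FOR DELETE
    AS
    BEGIN
        -- block deletes that happen on a Tuesday and undo the attempted delete
        IF DATENAME(WEEKDAY, GETDATE()) = 'Tuesday'
        BEGIN
            RAISERROR('Product records cannot be deleted on Tuesdays.', 16, 1);
            ROLLBACK TRANSACTION;
        END
    END;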
