Important note: OBIEE 11g can only be installed with the "Software Only" method in 64-bit environments

After a couple of failed attempts to fix a problem with the new graphics engine not rendering some charts on my dashboards, Oracle support engineers told me that the only way to install OBIEE 11g on 64-bit Unix/Linux environments is to use the "Software Only" option in the installer and install WebLogic separately.

The image in this note shows the "Enterprise" option, which installs a 32-bit version of WebLogic. The advantage of using a SUSE Linux virtual server running on Oracle VirtualBox is that I can easily restore my backup and start the process over without having to install the database and run RCU again.

I hope to publish an installation guide for 64-bit environments; in the meantime, here is the link to the notes where this limitation is mentioned:

http://download.oracle.com/docs/cd/E14571_01/doc.1111/e14770/biee.htm#CHDIHIE

Posted in Español by Ignacio de la Torre. No Comments

Materialized View Maintenance – Part I: Guidelines


I have been working with a lot of materialized views in our PeopleSoft Campus Solutions Warehouse lately, and let me tell you, they can be a pain sometimes. Before I go into technical details, here are some best practices we have developed in our environment for data that is loaded once nightly:

  1. Do not use the ON COMMIT refresh method; it is better to use REFRESH COMPLETE with ON DEMAND, or something like NEXT TRUNC(SYSDATE+1) + (1/24). This applies to most data warehouses with no real-time data streams.
  2. If data warehouse users create contention with your MView refreshes, it is advisable to schedule these processes while the database is unavailable to them; in our case that is the maintenance window right after the warehouse comes back online after backup.
  3. As with other physical database objects, it is advisable to separate indices from the base data; the easiest way to do so is to host them in a different Oracle tablespace.
  4. Make sure to create a BITMAP index for each surrogate key column in your view, to leverage Oracle star joins when query rewrite is enabled.
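As a rough sketch of guidelines 1, 3, and 4 in Oracle SQL — note that the view, column, and tablespace names (mv_dim_keys, w_org_d, org_wid, mview_data, mview_idx) are hypothetical placeholders, not our actual warehouse objects:

```sql
-- Hypothetical helper view refreshed nightly at 1:00 AM: TRUNC(SYSDATE+1) + (1/24).
-- REFRESH COMPLETE avoids the overhead and restrictions of ON COMMIT fast refresh.
CREATE MATERIALIZED VIEW mv_dim_keys
  TABLESPACE mview_data
  REFRESH COMPLETE
  START WITH TRUNC(SYSDATE + 1) + (1/24)
  NEXT  TRUNC(SYSDATE + 1) + (1/24)
  ENABLE QUERY REWRITE
AS
SELECT org_wid, MAX(effdt) AS latest_effdt
FROM   w_org_d
GROUP  BY org_wid;

-- Bitmap index on the surrogate key, hosted in a separate tablespace
-- (guidelines 3 and 4)
CREATE BITMAP INDEX bix_mv_dim_keys_org ON mv_dim_keys (org_wid)
  TABLESPACE mview_idx;
```

This fragment assumes an Oracle database with query rewrite enabled; adapt the names and the refresh window to your own schedule.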

It is important to note that these materialized views are used as helper tables to determine the most recent surrogate key values for each dimension entity, and the datasets are fairly small. We are testing partitioning some of this data, but we haven't found an optimal partitioning method that yields the partition-elimination benefits we would like.

Check out the Puget Sound Oracle User Group page (the Oracle user group for the Pacific Northwest) for a deeper look at materialized views and DBMS_MVIEW.

OBIEE Repository Reporting


I made a nice discovery today while trying to extract a list of the physical table aliases under the Global Dimensions physical display folder in my PeopleSoft EPM Fusion Intelligence repository. After several frustrating attempts to use the query-repository dialog, I discovered that the appropriate filters are populated automatically if I right-click the physical display folder and select "Display Related > Physical Table". Once I figured this out, I was able to export the list of tables to a CSV file that I could manipulate and format with Excel. Here's a picture to help you find the dialog:

How to get a list of objects contained in a physical display folder

Posted in Fusion Intelligence OBIEE Tips by Ignacio de la Torre. No Comments

OBIEE Logical Table vs Logical Table Sources – Best Practice


What is the difference between Logical Tables, Logical Table Sources (LTS), and the physical tables that compose a Logical Table Source? Here are some basic guidelines that have helped me get this concept to sit more clearly in my mind.

BMM Layer - Logical Table and LTS

Logical Tables, like Organization in the picture on the left, are used to create drill-down paths or dimensions like Dim_Organization.

Usually, when you drag and drop a column from a table that is not currently being used in your Logical Table, the physical table containing that column gets added as a new Logical Table Source (LTS), such as UofA in the image on your left. This is usually not the result you should be aiming for. In general, when the column you are adding to a Logical Table comes from the same system of record you have been using, you should instead rename the LTS to reflect the name of the system of record (the same way we renamed ours to UofA) and then edit the sources for that LTS.

The confusion often stems from having what I would call "Logical Table Source" Sources (LTSS). The image below depicts them for UofA; you can access this dialog by double-clicking any LTS.

BMM - Logical Table Source Properties and LTSS

Physical table FRS_GL_ACCOUNTS is a "Logical Table Source" Source (LTSS) for the Logical Table Source UofA. The rationale, again, is that UofA represents a system of record (source) that brings in information from more than one physical table to build our Organization logical table.

If you still have questions add comments to this post and let’s get the discussion started :)

Posted in OBIEE Reference Tips by Ignacio de la Torre. 2 Comments

OBIEE 10g Administration – useful Linux environment variables


I have been working a lot on content migration and service configuration of our Linux-based OBIEE servers lately. One thing that has definitely helped me is that I took some time to define these environment variables in the Linux account's ~/.bash_profile file:

OBI_HOME=  #quick navigation to my OracleBI folder (e.g. OBI_HOME=/some/really/long/path/OracleBI)
export OBI_HOME

OBI_DATA=  #quick navigation to my OracleBIData folder (e.g. OBI_DATA=/some/really/long/path/OracleBIData)
export OBI_DATA

OBI_REP=  #quick navigation to my OBIEE repository folder (e.g. OBI_REP=$OBI_HOME/server/Repository)
export OBI_REP

OBI_CAT=  #quick navigation to my OBIEE Presentation Services catalog folder (e.g. OBI_CAT=$OBI_DATA/web/catalog/)
export OBI_CAT

Now I can move to my repository folder from anywhere in the file system by just typing:

$ cd $OBI_REP

Install a local Subversion server on your Windows machine


I need to test some of Subversion's functionality without our server admins hating me at work, so I decided to build a local server based on this post by Jeff Atwood. I installed the server with the Apache 2.0 bindings from SVN's website. After getting the binaries onto my computer's hard disk, this is how it went:

  • I first listed the create command help page to check out options and parameters:
    svn_create_command_help
  • It was hard to believe how easy it is to create a repository:
    svn_create_command
  • This is the directory structure in the new repository
    svn_create_command_2
  • This is how the conf directory looks on the inside:
    04_svn_create_command_3
  • This is how the password file looks out of the box:
    svn_create_command_4
  • I followed the instructions to the letter when editing the svnserve.conf file:
    svn_create_command_5
  • I love the sc command; it is usually far more flexible than "net". When running "sc create" to create a Windows service, always remember to separate each parameter's value from the equals symbol with a space, or the command won't work:
    svn_run_as_service
  • Set the SVN_EDITOR variable to Notepad’s path:
    svn_run_as_service_2
  • Now that your SVN service is started, you are ready to create the folder structure in your brand-new repository. Make sure you use the svn protocol as the first component of your project URL; I was using http:// and file:// and getting nowhere until I realized the correct protocol is svn://
    svn_mkdir_error
  • You can use the –parents modifier of the mkdir command to save some keystrokes by creating a full directory path at once:
    svn mkdir --parents
  • I downloaded and installed TortoiseSVN, a very nice SVN client that works as a Windows shell extension; by right-clicking in any Windows Explorer window I get access to the menu shown below:
    tortoiseSVN_repository_browse
  • After clicking Repo-browser and entering svn://localhost I can browse the folder structure I just created in my repository in a nice graphical format:
    tortoisesvn_repo-browse
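To illustrate the "sc create" gotcha from the steps above: sc parses "binPath=" as the option name and takes the value as a separate argument, so each equals sign must be followed by a space. The repository path and service name below are hypothetical examples, not the exact values from my install:

```bat
REM Create the repository, then register svnserve as a Windows service.
svnadmin create C:\svn\repo

REM Note the space after each '=' (binPath= , displayname= , depend= , start=);
REM without it, sc rejects the command.
sc create svnserve binPath= "\"C:\Program Files\Subversion\bin\svnserve.exe\" --service -r C:\svn\repo" displayname= "Subversion Server" depend= Tcpip start= auto
```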

For any inquiries on this topic, please post to the General Discussion or Best Practices threads on THE Business Intelligence Forum, our better alternative to sites that make you pay to look at answers (like DSXchange) or that disallow references to any of their competitors (like ITtoolbox)…

Posted in Subversion by Ignacio de la Torre. No Comments

Issues with DB connection when creating the Informatica PowerCenter Repository



This was one of those errors that is very simple yet takes a lot of your time to figure out. When trying to create an Informatica PowerCenter Repository Service in the Administration Console web interface, I ran into "An Exception has occurred. The database connection detail for service INFA_REPO_SVC are invalid. Please check log for details". After a long time searching online and looking at different forums, I finally came across a post that recommended the following format for the repository database entry in tnsnames.ora:

BIAPPS.world =
  (DESCRIPTION =
    (ADDRESS =
      (COMMUNITY = mycompany.world)(PROTOCOL = TCP)(HOST = myhost)(PORT = 1521)
    )
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SID = biapps)
      (GLOBAL_NAMES = biapps.world)
      (SERVICE_NAME = biapps)
    )
  )

Once I changed my tnsnames.ora I was able to connect and create my repository. Check this other post for the script I used to create the database account and storage configuration for the repository.

Tip: Getting Unix command output and status in your email


I have a scheduled script that refreshes some of my materialized views, but these operations can fail if the database backup takes too long or an unexpected condition presents itself. This led me to send the output of my refresh procedures to my email and my team's listserv; here are sample versions of my scripts:

daily_mview_refresh.sql:

--connect to the database
conn / as sysdba
--Refresh Instructor FTE view
begin
PKG_SA_ETL_MAINTENANCE.refresh_instructor_fte;
end;
/
--Refresh Current Term Enrollment View
begin
PKG_SA_ETL_MAINTENANCE.refresh_term_enrl_current;
end;
/
exit

daily_mview_refresh.sh:

#!/bin/sh

#re-schedule script to run again tomorrow
at -f /path/db_maintenance/daily_job.sh 5:50am tomorrow

#Set ORACLE_SID
ORACLE_SID=MYDB
export ORACLE_SID

#Execute daily_mview_refresh.sql using SQL*Plus
date > daily_mview_refresh.log

echo "Materialized view refresh starting" >> daily_mview_refresh.log

sqlplus /nolog < daily_mview_refresh.sql >> daily_mview_refresh.log

echo "Materialized view refresh completed" >> daily_mview_refresh.log

date >> daily_mview_refresh.log

#email results
mail -s "Materialized View Refresh Output" ETL-List@mycompany.com < daily_mview_refresh.log

Something to keep in mind would be to incorporate appropriate error handling in the stored procedures being called, and also to provide some output that indicates progress through the materialized view refresh process. In the scripts I actually use, I also collect statistics and rebuild some indices; each view is refreshed in a separate procedure so as to contain any errors and allow the remaining views to be refreshed anyway.
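One way to make failures stand out in the inbox (an assumption on my part, not part of the original scripts) is to fold the exit status of the refresh into the mail subject. The refresh_subject helper below is my own invention; the list address and log name come from the post:

```shell
#!/bin/sh
# Sketch only: run a command, append its output to the log, and print a
# status-aware subject line for the notification email.
LOG=daily_mview_refresh.log

refresh_subject() {
  # run the given command, logging its output, then print the subject
  "$@" >> "$LOG" 2>&1
  if [ $? -eq 0 ]; then
    echo "Materialized View Refresh Output (SUCCESS)"
  else
    echo "Materialized View Refresh Output (FAILED)"
  fi
}

# 'true' stands in for: refresh_subject sqlplus /nolog @daily_mview_refresh.sql
subject=$(refresh_subject true)
echo "$subject"
# mail -s "$subject" ETL-List@mycompany.com < "$LOG"
```

With this shape, a failed sqlplus run flips the subject line, so you can filter or alert on FAILED without opening the log body.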

Posted in Misc Technical Tips by Ignacio de la Torre. No Comments

Configure your database to host your Informatica Repository (Oracle 11g)


The Informatica repository is designed to reside in a database system; for my local test environment I will host it in an Oracle 11g database. Here are two design principles to take into consideration when setting up the database that will host your repository:

  • Creating a dedicated tablespace for your repository data can make it easier to backup and maintain your repository database.
  • Because repository data has a footprint significantly smaller than that of your data warehouse, it is recommended to tune your Oracle STORAGE parameters to minimize initial table size and chained rows.

When creating a production repository, additional considerations should be taken to ensure that your backup strategy will always let you recover the repository data from a potential failure with full integrity. To help you get started, I'm providing a template for a script that creates a user and a tablespace and grants basic permissions to the user; I'm using this configuration for my local test install.

Download SQL_USR_INFA_REPO.sql
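I won't reproduce the downloadable script here, but a minimal sketch of what such a setup script might contain is below. The tablespace and user names, file path, sizes, and password are placeholders you would adapt, not the contents of SQL_USR_INFA_REPO.sql:

```sql
-- Dedicated tablespace with small uniform extents: repository data is
-- tiny compared to warehouse data (placeholder path and sizes)
CREATE TABLESPACE infa_repo_data
  DATAFILE '/u01/oradata/orcl/infa_repo_data01.dbf' SIZE 200M
  AUTOEXTEND ON NEXT 50M MAXSIZE 1G
  EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1M;

-- Repository owner with basic privileges
CREATE USER infa_repo IDENTIFIED BY change_me
  DEFAULT TABLESPACE infa_repo_data
  QUOTA UNLIMITED ON infa_repo_data;

GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW, CREATE SEQUENCE TO infa_repo;
```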

Review the step-by-step list of activities by going back to the Pre-Install Task List.

Technorati : Informatica, Oracle, PowerCenter, Repository, install

Subversion & TortoiseSVN references found online


Here's a list of useful references I've found online while trying to set up our DataStage ETL & OBIEE version-control environment:

  • Subversion best practices:

    http://svn.collab.net/repos/svn/trunk/doc/user/svn-best-practices.html

  • 30 minutes windows setup (this is how I got my test instance up on my laptop):

    http://www.codinghorror.com/blog/archives/001093.html

  • Subversion FAQ:

    http://subversion.tigris.org/faq.html

  • The Subversion Book:

    http://svnbook.red-bean.com/nightly/en/index.html

Posted in Reference Subversion Technical by Ignacio de la Torre. No Comments

bounce_like_a_tiger.sh


Since the rest of the team found the name of the script I created to bounce our OBIEE server very funny, I decided to share bounce_like_a_tiger.sh with you:

#!/bin/sh

#bounce_like_a_tiger.sh: start or stop the OBIEE services in the right order

#get the action/parameter
command=$1

#set work directory
OBI_HOME=/opt/OBIEE/OracleBI
export OBI_HOME

#run one service-control script and abort if it fails
run_step () {
  script=$1
  echo "Executing: "
  echo ${OBI_HOME}/setup/${script} $command
  ${OBI_HOME}/setup/${script} $command
  ec=$?
  echo "exit code: " $ec
  if [ "$ec" != "0" ]; then
    echo "An error occurred running ${script} ... bounce_like_a_tiger.sh is aborting..."
    exit 1
  fi
}

case "$command" in
  start|autorestart)
    echo "bounce_like_a_tiger.sh WILL START ALL SERVICES..."
    run_step run-sa.sh   #Step 1: OBI Server
    run_step run-saw.sh  #Step 2: OBI Presentation Services
    run_step run-sch.sh  #Step 3: OBI Scheduler
    run_step run-ccs.sh  #Step 4: OBI Cluster Controller
    ;;
  stop|forcestop)
    echo "bounce_like_a_tiger.sh WILL SHUT DOWN ALL SERVICES..."
    run_step run-ccs.sh  #Step 4: OBI Cluster Controller
    run_step run-sch.sh  #Step 3: OBI Scheduler
    run_step run-saw.sh  #Step 2: OBI Presentation Services
    run_step run-sa.sh   #Step 1: OBI Server
    ;;
  *)
    echo "Invalid parameter..."
    echo "Syntax: "
    echo "        bounce_like_a_tiger.sh { start | stop | forcestop | autorestart }"
    exit 1
    ;;
esac

exit 0

Posted in OBIEE by Ignacio de la Torre. 1 Comment

Uninstall Oracle Application Server


Here’s my latest run on uninstalling Oracle Application Server 10.1.3.0 from a SuSE 10 Enterprise box:

Important: Be sure to backup your OAS instance and the OAS repository database before you proceed with these instructions.

  1. Run Oracle Universal Installer from the OAS home directory:
    $ORACLE_HOME/oui/bin/runInstaller
  2. Click on "Deinstall products…" and select the Oracle home where your application server resides.
  3. Click on "Remove" and confirm you want to remove OAS in the dialog that pops up.
    Once the progress-bar dialog disappears, your application server binaries have been removed.
  4. Back up the directories directly above your ORACLE_HOME directory:
    ls $ORACLE_HOME/..
    cd $ORACLE_HOME/..
    mkdir oas_inst_bak_1
    #modify accordingly
    mv OracleAS_1 oas_inst_bak_1/
    mv OracleAS_2 oas_inst_bak_1/
    mv OracleAS_3 oas_inst_bak_1/
  5. Log in to your OAS repository database (the one that's already backed up) and drop it.

P.S.: Complete instructions can be found on Oracle's OTN at http://download.oracle.com/docs/cd/B32110_01/install.1013/b32198/deinst.htm#CHDHFGDA

Posted in GoogleTracking OAS by Ignacio de la Torre. 1 Comment

PMO Documentation Templates


Yesterday I had the opportunity to take a first pass at designing documentation and project-management templates for our PeopleSoft implementation along with two of my co-workers. Here are some of the lessons we learned as we started working with the KISS principle in mind:

APPROACH

  • We are a state institution beginning a very large ERP implementation with limited resources, so project controllership is a key role, but we strive to keep the resource overhead related to our documentation templates to a minimum.
  • We started with a list of documents in mind, but we also created a list of sample requirements and project-management activities ranging from "Implement HR Benefits Administration" and "Add XYZ account to chart of accounts" to "Add space to server logical volume". This helped validate that our list of documents could accommodate all kinds of activities, and it helped us identify some of the attributes and information that should be captured in all of them.
  • Some of our initial documents are:
    • Statement of work / project plan
    • Fastrack Project Charter (for smaller projects)
    • Functional Requirements Specification
    • Technical Requirements Specification

USABILITY

  • We took a requirements-gathering class at the TDWI Conference, and the instructor provided a neat Excel spreadsheet where input to certain columns was done via drop-down menus; this is a very nice way to maintain consistency and drive standardization.
  • We try not to include any non-essential information in these forms, to minimize overhead and keep each form as generic as possible.

ACCESSIBILITY

  • Becky and Mary have found a great way to make these templates available by creating a folder hierarchy that resembles the project organization, in a file share accessible to all members of the project team.

More to come on this as the project keeps rolling… enjoy  your weekend!

OBIEE MUDE Repository Life Cycle


I just finished an OBIEE Repository Life Cycle diagram in Visio; I hope it can be of help during training and in your organizations. The goal was to illustrate, in a very simple diagram, the metadata development process in an OBIEE multi-user development environment (MUDE).

Bon Soir!
-Ignacio

Posted in Best Practices OBIEE Reference by Ignacio de la Torre. 1 Comment

Drop [oracle] database (and clean after yourself)


Hey, I've got to document this so I don't forget how to clean up after myself:

  1. First, find out which files to clean up after dropping the database (thanks to geekinterview.com):
    select * from dba_data_files;
    select * from v$logfile;
    select * from v$controlfile;
    archive log list
    initSID.ora
    In addition, you can clean up the UDUMP and BDUMP directories, scripts, etc.
  2. Now go ahead and drop that database (the one that's already backed up):
    ORACLE_SID=<OAS Repository SID>
    export ORACLE_SID
    sqlplus /nolog
    conn / as sysdba
    shutdown immediate
    startup mount exclusive restrict;
    drop database;
    exit
  3. Now go and clean up those files!
Posted in Oracle Database Tips by Ignacio de la Torre. No Comments

PeopleSoft EPM Dimension NULL records


After successfully loading the compensation mart, I had to help bring the rest of the marts up to speed. The most common data issue the rest of the facts were experiencing was dimensions that weren't being implemented, like PS_D_GOV_POI.

The PS_POSITION_DATA.GVT_POI and PS_POSITION_DATA.GVT_SUB_AGENCY columns in HCM are being defaulted to a blank space by our conversion programs. This is tolerable on the transactional side, since there are no relational-integrity constraints at the database level: if the position conversion program defaults the fields to a blank space or to a value that does not exist in PS_GVT_POI_TBL or PS_GVT_SUBAGCY_TBL, no warnings or errors are raised.

Now, this is not fine in the warehouse: the blank spaces are converted to dashes ('-') by the dimension load programs, and since PS_GVT_POI_TBL contains no rows, PS_D_GOV_POI ends up an empty dimension.

The solution to all these issues is very simple: all dimensions that are NOT being implemented should contain what I'm going to call "NULL rows", for lack of a better term. These NULL rows have code values in the dimension that match the default values used by the conversion program, so that joins like the following work:

TRANSACTION:

SELECT *
FROM PS_POSITION_DATA, PS_GVT_SUBAGCY_TBL
WHERE PS_POSITION_DATA.GVT_SUB_AGENCY = PS_GVT_SUBAGCY_TBL.GVT_SUB_AGENCY

WAREHOUSE:

SELECT *
FROM PS_GVT_SUBAGCY_TBL, PS_D_GOV_SUBAG
WHERE PS_D_GOV_SUBAG.GOV_SUBAG_CD = PS_GVT_SUBAGCY_TBL.GVT_SUB_AGENCY
AND PS_GVT_SUBAGCY_TBL.COMPANY = '<your company code here>'
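As an illustration of seeding such a row (the column names below are abbreviated guesses on my part, not the actual PS_D_GOV_SUBAG layout), the "NULL row" carries the same default code the conversion programs emit, here the dash:

```sql
-- Seed a "NULL row" so facts carrying the defaulted dash value still join
-- to the dimension instead of dropping out of the result set.
INSERT INTO ps_d_gov_subag (gov_subag_sid, gov_subag_cd, gov_subag_ld)
VALUES (0, '-', 'Not Implemented');
COMMIT;
```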

Posted in PeopleSoft EPM PeopleSoft EPM Functional by Ignacio de la Torre. No Comments

PeopleSoft Enterprise Performance Management Warehouse Schema Naming Standards


The nicest thing about PeopleSoft, after the whole PIA metadata-driven interface and its impact on data quality, is how meticulous PeopleSoft engineers are about documentation and sticking to naming standards. Here's a simplified diagram of the warehouse architecture and the naming standard for objects in each section of the schema:

PeopleSoft EPM schema naming standards

DataStage: ParamName does not reference a known parameter of the job


Today I got an error that reads like this:

JobControl (@<JOB_NAME>): Controller problem: Error
calling DSSetParam($<PARAMETER_NAME>), code=-3
[ParamName does not reference a known parameter of the job]

After a quick debug/issue-tracking session in Director, I found that one of the jobs in my master sequence was missing all of our standard parameters for database user, schema, and password. Since we define them at the project level, all I had to do was click on "All to default" and the problem was promptly fixed.

Posted in PeopleSoft EPM Uncategorized by Ignacio de la Torre. No Comments

Abnormal termination of stage (cast those CLOBs!)


As I started running a bunch of our delivered DataStage ETL jobs, I ran into a really unhelpful warning message in several Director job logs. Most of them failed on the IPC stage right after the DRS stage that was extracting data from our source database, with an "Abnormal termination of stage" error. After several attempts to turn on tracing and find a rational explanation for the error, I went into my source database and realized that every table having this issue contained CLOB columns. After some testing, the issue was resolved by casting the CLOB columns to the VARCHAR2 data type:

CAST(my_column AS VARCHAR2(4000))

Now, these were jobs delivered with our warehouse, and the designers were probably trying to make them as generic as possible. After some conversations with the warehouse team, I got authorization to replace the CLOB content with empty strings to save space, as the data in those columns is of low value to our BI users.

Google Webmaster Tools


If you feel like tracking how the rest of the world sees your blog (and I guess we can assume the rest of the world uses Google, of course), adding Google Webmaster Tools is a very good idea:

Google Webmaster Tools:

https://www.google.com/webmasters/tools

How to add them to WordPress:

http://faq.wordpress.com/2008/08/08/how-do-i-verify-my-site-with-google/
