Channel: Oracle Articles / Blogs / Perficient

Planning, Budgeting and Forecasting Pitfalls to Avoid in 2015


Many companies struggle with creating effective, insightful budgets and forecasts. The existing processes for creating budgets and forecasts are often highly inefficient and outdated.

Throughout my consulting career, I have come across several consistent mistakes that companies make when setting up their planning, budgeting and forecasting processes. In this blog I share tips for addressing these pitfalls as we head into 2015. These best practices can create opportunities to significantly improve the quality of the forecasting information delivered to executive management and line managers.

1. Problem: Excessive amount of detail

The inputs for budgets and forecasts are collected at too low a level. Too many inputs make the budgeting and forecasting process time consuming and inefficient. Instead of obtaining a new monthly forecast soon after each accounting close, decision makers must wait until the third or fourth week of the month to receive an updated, actionable re-forecast.
Furthermore, the additional detail prevents the budget or forecast from being accurate. This finding is counterintuitive, but I have found it to be true at many of my clients. In fact, one of my clients in the technology industry employed a very extensive monthly forecasting process that consistently produced forecasts about 20% off the mark when compared to the actuals for a given month. Our client sponsor, the company's new CFO, was so frustrated with the existing forecasting process that he asked his secretary to run a quick forecast of her own every month using the Crystal Ball software. That forecast, created by a secretary with no Finance background, used very few inputs and yet proved very accurate month after month.

Solution: Reduce the level of detail in the annual Budget and the monthly Forecast

The annual Budget should contain enough detail to support a robust annual bottom-up planning process. It should not try to “boil the ocean” by planning at the level of detail at which actuals typically come in.
The monthly Forecast should be done on an exception basis only: if there is a significant change in the business outlook, it should be reflected in the new Forecast; if not, the Forecast should simply be tweaked to account for the last month's actuals. On quarter-end months the Forecast should be rebuilt from the bottom up, with the inputs for all accounts revised.

2. Problem: Too much focus on the annual Budget

Most companies still go through an annual budgeting process which typically takes anywhere from 4 to 7 months and puts a significant burden on the Finance department. Once the final version of the Budget is approved, the assumptions in it are typically so stale that the entire Budget can be almost totally discounted.

At the same time, companies do not employ the most effective forecasting tool at their disposal: the rolling forecast.

Solution: If this solution sounds radical, that is because it is: companies should stop creating annual Budgets. They provide little additional value and tie up significant Finance resources for extraordinarily long periods of time.

Leading companies rely on 18-month rolling forecasts to maintain a permanent 18-month window into the future. This allows management to plan business operations over a meaningful, constant horizon. A typical 12-month annual forecast (not a rolling one) becomes less and less relevant as the year progresses and does not provide a long enough window for management to make effective longer-term decisions.
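To make the idea concrete, the 18-month window itself is trivial to compute. Here is a minimal Python sketch, purely illustrative and not tied to any Hyperion product:

```python
from datetime import date

def rolling_forecast_periods(start: date, months: int = 18) -> list:
    """Return the (year, month) periods covered by a rolling forecast
    that begins in the month of `start`."""
    periods = []
    y, m = start.year, start.month
    for _ in range(months):
        periods.append((y, m))
        m += 1
        if m > 12:
            m, y = 1, y + 1
    return periods

# A forecast generated in March 2015 always looks 18 months ahead,
# regardless of where the fiscal year ends.
window = rolling_forecast_periods(date(2015, 3, 1))
print(window[0], window[-1])   # (2015, 3) (2016, 8)
```

The point of the sketch is that the horizon never shrinks: every re-forecast covers a full 18 months, unlike an annual forecast whose remaining window decays to zero by December.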

My clients who changed their Planning, Budgeting and Forecasting processes to avoid the two pitfalls above have been able to achieve very significant improvements in the quality of the forecasting information delivered to their executive management teams and line managers. This, in turn, led to much improved decision making and significant business growth.


Oracle Critical Patch Update Released | 169 fixes


Oracle recently released its January 2015 Critical Patch Update, which includes 169 new security fixes across the following product groups:

  • Oracle Database
  • Oracle Fusion Middleware
  • Oracle Fusion Applications
  • Oracle Enterprise Manager
  • Oracle Applications – E-Business Suite
  • Oracle Applications – Oracle Supply Chain, PeopleSoft Enterprise, JD Edwards Product Suite, Siebel and iLearning
  • Oracle Communications Industry Suite
  • Oracle Retail Industry Suite
  • Oracle Health Sciences Industry Suite
  • Oracle Java SE
  • Oracle and Sun Systems Products Suite
  • Oracle Linux and Virtualization Products
  • Oracle MySQL

None of the database vulnerabilities is remotely exploitable without authentication, but a number of the vulnerabilities addressed are severe.

Oracle highly recommends applying this Critical Patch Update as soon as possible.

Oracle has received specific reports of malicious exploitation of vulnerabilities for which Oracle has already released fixes. In some instances, it has been reported that malicious attackers have been successful because customers had failed to apply these Oracle patches. Oracle therefore strongly recommends that customers remain on actively-supported versions and apply Critical Patch Update fixes without delay.

Integrating HFM and Planning Multi-Currency Applications


I am often asked by clients and colleagues why Hyperion Financial Management and Hyperion Planning are so different.  It’s typically when they are struggling to build an interface between the two, and they can’t reconcile converted currencies.  So today I want to share some of the industry’s best practices that should be taken into consideration when designing an integration between HFM and Hyperion Planning multi-currency applications.

It is important to understand how currency translation works in these two products before designing your integration solution. The following topics will be discussed:

  • Key differences between HFM and Planning dimensions
  • Currency translation in HFM
  • Currency translation in Hyperion Planning
  • A simple integration strategy using native Hyperion tools

Key differences between HFM and Planning

Hyperion Planning:
  • Primarily used to develop a Planning, Budgeting and Forecasting solution
  • Uses two data sources: the Essbase database stores numeric and date datatypes, while the relational store captures non-numeric datatypes such as text
  • In a block storage option cube, supports a maximum of 20 dimensions (19 + 1 for Currency)
  • The 6 standard dimensions are: Year, Period, Scenario, Version, Entity, Account
  • Calculations are written using Essbase calculation scripts, commonly known as business rules

Hyperion Financial Management:
  • Primarily used for Financial Consolidation purposes
  • Stores data in a relational store
  • Pre-11.1.2.3: 12 dimensions (8 standard and 4 custom); 11.1.2.3 and later: 8 standard dimensions plus unlimited custom dimensions
  • The 8 standard dimensions are: Scenario, Period, Year, View, Entity, Value, Account, ICP
  • Calculations are written using a modified VB Script

Currency Translation in HFM


Before an HFM application is deployed for the first time, some application and dimensional configurations are performed. It is helpful to know which of these application properties are set at the time of application creation.

FROM and TO Dimension Settings:

The FROM and TO dimensions define where the currency exchange rates are stored. Any custom dimension can be selected, but the selection cannot be changed after the application is created.

To identify this setting in EPMA, select a custom dimension and check the “Use for Currency” property in the property grid. In this case, we will use Custom1 and Custom2.

IBHH001

IBHH002

Other Application and Dimension properties    

  • Default Currency: Generally set to USD
  • Default rate for Flow Accounts: Generally set to AverageRate
  • Default rate for Balance Accounts: Generally set to ClosingRate
  • PVA (Periodic Value Add) translation method: accumulates the periodic value of an account; usually enabled for Flow accounts
  • Accounts assigned to translate Balance or Flow in Application Settings
  • Define the local currency that is associated with each entity in the metadata

IBHH003

IBHH004

IBHH005

Once these properties are set, HFM has a delivered approach to currency translations and consolidations.  The typical process in HFM has the following steps:

  1. Open future period
  2. Load data
  3. Run consolidations on an Entity
  4. The system runs calculation rules for all descendants of the Entity.
  5. If the data for the child Entity and the data for the parent entity are in different currencies, the system translates data based on the local or global exchange rate. For the child entity, the translated value is stored in the Parent Currency member of the Value dimension. The translated value in Parent Currency is rolled up to the parent.
  6. Optionally, you can add adjustments to base data through journals. These are stored in <Entity Currency Adjs> or <Parent Currency Adjs> or <Parent Adjs>.
  7. The consolidation process continues. If the parent’s ownership of the child is less than 100%, the ownership percentage is applied. The system generates proportion and elimination detail, and creates contribution data.
  8. You can make further adjustments to contribution data through journals.
  9. Reapprove the adjusted data
  10. Lock data (Lock/close the month)
  11. Publish reports
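The arithmetic behind steps 5 and 7 can be sketched in a few lines of Python. This is illustrative only; HFM performs these calculations internally during consolidation, and the values here are made up:

```python
# Sketch of the translate-then-consolidate arithmetic: a child's
# local-currency value is translated into the parent's currency, then
# the parent's ownership percentage is applied to produce the child's
# contribution to the parent.
def contribution(local_value, fx_rate_to_parent, ownership_pct):
    """Translate a local value to the parent currency and apply ownership."""
    parent_currency_value = local_value * fx_rate_to_parent
    return parent_currency_value * ownership_pct

# 1,000 in local currency, rate 1.25 to parent currency, 80% owned.
print(contribution(1000.0, 1.25, 0.8))   # 1000.0
```

In HFM terms, the intermediate `parent_currency_value` lands in the Parent Currency member of the Value dimension, and the final figure is what rolls up to the parent entity.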

Currency translation can be performed using the delivered approach or can be further customized using translation rules to suit your accounting and reporting needs. For the delivered approach, the currencies and rates are defined as part of the metadata definition. The exchange rates are entered in the memo accounts above (AverageRate, ClosingRate) and can be loaded to a global entity or a specific entity.

Value dimension

The value dimension is one of the eight standard dimensions in HFM. It is a system defined dimension that stores the different types of values in the HFM application for a given entity, including currencies.

The diagram below shows how the data flows through the system as part of the consolidation process. This is where the actual translations are stored. The value dimension is important when viewing and extracting the right intersection to export translated and untranslated data.

IBHH006

Currency Translation in Planning


Delivered Approach

A multi-currency setting is enabled at the time of the creation of a Planning application. The default application and reporting currency is generally set to USD.  Additional reporting currencies can be enabled as needed.  A multi-currency application has two additional dimensions.  Both of these dimensions are, by default, sparse in nature.

  • Currency: defines local and reporting currencies.
    • Local Currency
    • USD
    • EUR
    • GBP
  • HSP_Rates: stores data values and exchange rate information.
    • HSP_InputValue: stores data values
    • HSP_InputCurrency: stores the currency type for HSP_InputValue above
    • HSP_Rate_USD
    • HSP_Rate_EUR
    • HSP_Rate_GBP etc

For example, the HSP_InputValue might be 500 and the HSP_InputCurrency might be GBP.

Note: The exchange rates are initially stored in the relational planning schema and pushed to Essbase.

Configuring Exchange Rate Tables

Before defining an Exchange Rate table for the first time, the following prerequisites should be met:

  • Define the currency members in the Currency dimension
  • Associate a base currency with each member of the Entity dimension

Then create the table:

  • Navigate to Administration > Manage > Exchange Rates
  • Define an Exchange Rate table and enter the rate information into the table

There are three exchange rate types associated with a currency:

  • Average
  • Ending
  • Historical

The exchange rate type is associated with account members through their properties. You must also associate the Exchange Rate table with the scenario for which you want to perform the currency conversion.

Next, create the Currency Conversion Script:

  • Navigate to Administration > Manage > Currency Conversion
  • Provide the currency conversion script name – e.g. Budget
  • Choose the reporting currency – USD
  • Choose the scenario to which the Exchange Rate table is attached
  • Choose the version associated with the scenario
  • Select the version type : Bottom-up or Target
  • On Save, two calculation scripts are generated: HspCRtB.csc and Budget.csc

The exchange rate information is initially stored in planning application’s relational database. Upon deployment of the application, the rates are stored in Essbase.

The exchange rate information is stored in the following Planning relational tables:

  • HSP_FX_RATE_VALUES
  • HSP_FX_RATES
  • HSP_FX_TABLE
  • HSP_FX_VALUES

Execute the conversion script HspCRtB.csc:

  • This script copies the appropriate rates to each account based on the account’s properties.
  • This script must be executed any time the exchange rates have been updated.

NOTE:  The script is available in all three plan types and therefore must be executed for each one individually.

Integration between HFM and Planning


The keys to building a proper interface for currency integrations are:

  • Understanding the integration points (for master data and data), especially how and where the data to be interfaced is stored in both the source and the target
  • Establishing naming standards as part of the design definitions
  • Selecting the appropriate tools for integration solutions

The diagram below depicts a sample interface from HFM to Planning.  This example relies on Extended Analytics and EPMA Batch Client to move data between the applications:

IBHH009

These are the steps in the data flow:

  1. HFM
     Description: Extended Analytics is a tool used to export HFM data directly to a relational database in a star schema format. The integration process can be further automated using task flows.
     Processing logic: Configure the destination database after setting up a UDL file database connection, then use Extended Analytics to export the data to the Hyperion interface staging area.

  2. Hyperion Interface Staging Area
     Description: The data is transformed using SQL views that define the necessary mappings and filters, ready to be interfaced to the target Planning application.
     Processing logic: The staging area is a data mart where the HFM extract is populated in a star schema format. SQL database views then transform the data into a format acceptable to the target.

  3. Hyperion Planning
     Description: The Data Synchronization module is available only for EPMA applications and makes it easy to transfer data across them; it can also use flat files and database views as data sources. It is generally used to sync data between two Hyperion applications (for example, HFM and Planning) whose dimensions and grain are almost identical. Audit and notifications are delivered features of the process. Data synchronization can be executed from Workspace, automated with the EPMA Batch Client, orchestrated from ODI, or automated using the delivered task flow feature for EPMA applications.
     Processing logic: Configure the interface data source and define the EPMA views needed to interface the data, then define Data Synchronization mappings to move the data from the staging area to the Hyperion Planning application.

IBHH0010

Using Extended Analytics, the data is extracted out of HFM to a dimensional data model where the data layout is in a star schema format. There are various extract formats, such as Standard, Essbase, and Data Warehouse; the best practice is to use the Data Warehouse format. The database topology is star-shaped and consists of two sets of tables: fact tables and dimension tables.
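As a rough illustration of what the staging-area views do, the following Python sketch builds a miniature star schema in SQLite and resolves surrogate keys to member names. Every table, column, and view name here is an assumption made for illustration, not the actual Extended Analytics schema:

```python
import sqlite3

# Hypothetical miniature of a star-schema staging area: one fact table
# keyed to two dimension tables, with a SQL view doing the mapping.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE FACT (ENTITY_ID INTEGER, ACCOUNT_ID INTEGER, AMOUNT REAL);
CREATE TABLE ENTITY_DIM (ENTITY_ID INTEGER, LABEL TEXT);
CREATE TABLE ACCOUNT_DIM (ACCOUNT_ID INTEGER, LABEL TEXT);
INSERT INTO FACT VALUES (1, 10, 500.0);
INSERT INTO ENTITY_DIM VALUES (1, 'E_1000');
INSERT INTO ACCOUNT_DIM VALUES (10, 'Sales');
-- The view resolves surrogate keys to member names; in a real interface
-- it would also apply renames and filters to match the Planning outline.
CREATE VIEW V_PLANNING_LOAD AS
SELECT e.LABEL AS ENTITY, a.LABEL AS ACCOUNT, f.AMOUNT
FROM FACT f
JOIN ENTITY_DIM e ON e.ENTITY_ID = f.ENTITY_ID
JOIN ACCOUNT_DIM a ON a.ACCOUNT_ID = f.ACCOUNT_ID;
""")
rows = con.execute("SELECT * FROM V_PLANNING_LOAD").fetchall()
print(rows)   # [('E_1000', 'Sales', 500.0)]
```

The same join-and-map pattern, expressed as database views over the real fact and dimension tables, is what makes the staged data loadable by Data Synchronization.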

The two levels of data designed to be exported out of HFM are:

  1. Base data – This level contains the data from the base entities.

 

IBHH0011


2. Adjustments data – This level contains all adjustments data. The different adjustment types exported are Eliminations, Parent and Contribution Adjustments.

Planning only stores data at the base level for a bottom-up version. Hence, base members with an ‘_Elim’ suffix are created in the Planning Entity dimension to store the adjustment data; the currency at this level is identified by the default currency of the parent.

IBHH0012

Once the data is in the staging warehouse, there are a variety of tools and techniques to load the data to Planning.  In our simple example, we used EPMA Data Synchronization.  Some of the other techniques which are commonly used are:

  1. Data Load using Essbase load rule
  2. Oracle Data Integrator (ODI)
  3. Financial Data Quality Management (FDM)

In my next blog, I will discuss alternatives to using the default currency dimensions in Planning.  This often makes interfaces between HFM and Planning even more straightforward.  Thanks!

Hyperion Corporate Tax Provision product overview & key benefits


Hyperion Tax Provision (HTP) is a powerful tool that leverages HFM capabilities and functionality to streamline the tax close process and enhance tax automation. Here is an introduction to HTP for tax departments:

HTP Product Overview & Key Benefits:

The significant differences between HTP and its competition include:
– Not a bolt-on point solution for companies that already use Oracle Hyperion EPM products
– Based on a proven Oracle Hyperion EPM solution platform (HFM, FDM, HFR, Smart View, EPMA, etc.) with pre-existing features shared with other applications (FX translation, mobile-enabled reporting, etc.) that will continue to be improved and supported by Oracle
– Pre-built dimensions
– Lower change management impact and a common production support platform for administrators and users who already use the Oracle Hyperion suite of products
– Usually installed in the same environment as the Controller and Financial Reporting organization's HFM application
– Likely to become more integrated over time with a company's broader financial system architecture (ERP, BI/DW, MDM, other EPM; Oracle, SAP, etc.)
HTP key features:
– A calculation engine that addresses US GAAP, IFRS, and statutory tax reporting
– An easy-to-use wizard for automating tax differences
– A comprehensive suite of tax reports for the consolidated tax provision, including the tax disclosure
– Reconciliation of the tax accounts to the accounting data, and generation of the tax journal entry
– Powerful reporting tools to support analysis and tax planning in a single solution
– A return-to-accrual process that integrates with and enhances the tax compliance process
– Configurable supplemental schedules to address unique data collection and specific calculation requirements
– A web-based tax package and integrated workflow for global data collection
– A tax provision solution based on the same technology trusted by thousands of corporate finance organizations

HTP key benefits:
– Stronger tax risk management through a unified control framework
– Improved efficiency through integration of data and metadata with source ERP and financial consolidation systems
– Improved efficiency through standardization
– More time for tax to spend on analysis and planning instead of reconciliation
– Transparency and auditability from the consolidated tax disclosure down to source transactions
– Greater visibility into late adjustments
– Leverages your organization's investment in Oracle Hyperion Financial Management

Currency Translation in Hyperion Planning – A Custom Approach


It has become a common requirement in multi-national organizations to collect budgets and forecasts in local currency and expect automated currency translation as part of their Hyperion Planning application. This is for good reason: the operating units don’t want the hassle of translating to reporting currencies, corporate FP&A needs to report group and consolidated forecasts in one or more reporting currencies, and management needs the ability to easily perform constant rate and flex analysis based upon exchange rate assumptions. Configured properly, Hyperion Planning can eliminate all of the routine currency calculations that are buried in disconnected Excel-based forecasts today.

Hyperion Planning comes delivered with an “in-the-box” currency translation approach, and this may be preferred when integration with Hyperion Financial Management or other Hyperion Planning modules is required. Often the delivered approach is passed over in favor of a custom solution. Some of the key differences between the delivered and custom translation approaches are:

• The delivered approach requires two dimensions: Currency and HSP_Rates, whereas the custom solution only requires a Currency dimension and a handful of exchange rate accounts. This eliminates an unnecessary dimension and can improve aggregation performance dramatically.
• Under the delivered approach, Planning administrators must redeploy the Planning application for any FX rate changes to take effect. Also, currency conversion scripts must be recreated and redeployed after any changes to the Planning outline. A custom solution does not require any redeployment.
• Redeployment under the delivered approach requires administrator privileges, which can sometimes delay the velocity of updates and reforecasts.

A custom currency translation solution starts with creating a user defined Currency dimension at the time the application is created. The Currency dimension stores all the various local and reporting currencies along with an additional member named “Local Currency”. Planning data is stored for each entity at their base level in Local Currency. The FX rates are stored directly in the Essbase Account dimension, therefore redeployment of the application is not needed for FX rate changes to take effect.

The local currency of each entity is determined by a set of User Defined Attributes (UDAs), such as AED, ARS, AUD, etc., applied to the base members of the Entity dimension. Another set of UDAs is associated with the base members of the Account dimension to identify the currency translation method (AVG, EOM, etc.). Finally, a custom business rule performs the currency conversion from local currency to USD using the FX rates. A condensed version of the business rule is provided below.

Currency Translation Rules
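Outside of Essbase, the core logic of such a rule can be sketched in Python as follows. The UDA values, entity names, and rate accounts below are illustrative assumptions, not members of any real outline:

```python
# Sketch of the custom translation logic: the entity's UDA supplies its
# local currency, the account's UDA selects the rate type, and the FX
# rates live in ordinary rate accounts. All names are illustrative.
ENTITY_UDA = {"E_Germany": "EUR", "E_UK": "GBP"}        # local currency per entity
ACCOUNT_UDA = {"Sales": "AVG", "Cash": "EOM"}           # translation method per account
FX_RATES = {("EUR", "AVG"): 1.10, ("EUR", "EOM"): 1.08,
            ("GBP", "AVG"): 1.50, ("GBP", "EOM"): 1.52}  # rates to USD

def translate_to_usd(entity, account, local_value):
    """Translate a local-currency value to USD using the rate type
    implied by the account's UDA."""
    currency = ENTITY_UDA[entity]
    rate_type = ACCOUNT_UDA[account]
    return local_value * FX_RATES[(currency, rate_type)]

print(translate_to_usd("E_UK", "Sales", 500.0))   # 750.0
```

The real business rule does the same lookup with `@UDA` member selections and cross-dimensional references against the rate accounts, which is why no redeployment is needed when a rate changes: the rates are just data.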

The decision to go with delivered or custom currency translation is a key design decision and many factors must be considered: number and type of Planning modules (Core, Capex, Workforce, PFP), number of dimensions, administrative personnel and workflow, inbound/outbound interfaces, the list goes on.

At the very least, I hope this helps explain an alternative to the delivered currency approach. If you have any questions about currency translation in your Hyperion Planning environment, feel free to drop me a line at manooj.thomas@perficient.com.

 

Using Planning and Budgeting Cloud Service Simplified Interface


“It looks so cool!” – That was my first impression of the Oracle Planning and Budgeting Cloud Service (PBCS) simplified interface. The standard interface has proven easy enough to navigate, but now the simplified interface is even more intuitive and efficient to use. I trust most customers won’t have a steep learning curve when interacting with the simplified interface and the new features of the PBCS 11.1.2.4 release.

This new, simplified interface provides not only a very intuitive user experience but also an overview with quick access to commonly used functions, enabling administrators to monitor planning processes at their fingertips. Using the simplified interface, users can work on the client without giving up any capabilities, which improves server performance.

If you just subscribed and started building your own applications on PBCS, a new application creation wizard can help you create a sample, simple, or advanced application without trouble.

PBCS App Create Wizard

PBCS Home Page

For users who already have applications on PBCS, the “Simplified Interface” option appears on the landing page. Simply clicking the “Simplified Interface” icon leads you to your planning application according to your access role.

The simplified interface gives you a look and feel similar to your cell phone but in a desktop version.
PBCS Simplified Interface

Now I will walk you through the features of the simplified interface. Let’s begin by clicking the Dashboards icon on the home page. After clicking the Dashboards icon, an overview of the functionality displays and other icons appear across the top of the screen; similar to the toolbar on your browser.

PBCS Dashboards main page

• Customizable Dashboards help to analyze and visualize data, including the option to have real-time, dynamically displayed, web pages by simply adding URLs. It’s an easy-to-use section allowing the user to drag and drop items like forms, charts and URLs to the position desired within a custom pane/dashboard display (e.g. 4-pane in this example).
PBCS Dashboards - layout

• By clicking Tasks, users can view an overview of all tasks status via a pie chart. Users can then display the steps of a particular process by selecting the applicable task. The Tasks section also helps users plan and prioritize their workload and track the status of their planning process. However, if it is preferred for the task list to be organized in a tree view, users can return to the standard interface at any time. There are several embedded tutorials available for your reference while building processes as well.
PBCS Tasks

• MS Office- like Plans make it easier to manage data by viewing a simple grid, a composite grid and ad-hoc analysis options.
PBCS Plans

Tired of plain spreadsheets? The PBCS simplified interface allows users to define their own report formatting, including the option to provide supporting details related to data.
PBCS Plans - Formatting

• The Rules section lists all rules across all plan types. Users can launch the rules directly or they can go to the Console tab to check the job status.
PBCS Rules

• Under Approvals, users can check the status by sorting by Approvals Status, Current Owner or Planning Unit Name.
PBCS Approvals

• The Reports tab allows users to see and run reports directly.
PBCS Reports

• A centralized Console lets administrators view and manage applications from one place. In this dashboard-like console, users can see an overview of application statistics with an easy-to-use graphical interface as well as easily import/ export/ create dimensions by plan types.
PBCS Console

• From the Jobs tab, users can manage and schedule jobs to run or back up their data.
PBCS Console - Jobs

• The Settings functionality allows users to set their preferences and design how the interface looks. Apart from the Application Settings in the image below, users can create announcements on the Announcements tab and display a logo on the home page by inserting the logo URL on the Appearance tab.
PBCS Settings

• Academy serves as a comprehensive library for both administrators and users, providing extensive documentation and tutorials on managing regular tasks. Best practices for designing applications and suggestions for administering Planning on Oracle PBCS are also available for reference.
PBCS Academy

• In the Navigator section, users see what resources are available. It is a quick guide to redirect you to frequently used functions by hovering your mouse over the function and clicking it.
PBCS Navigator

I hope my brief introduction to the simplified interface on Oracle Planning and Budgeting Cloud Service encourages you to explore more functions within the PBCS solution. If you have any questions about PBCS, please feel free to leave a comment. Thanks.

What’s New for Oracle Hyperion DRM 11.1.2.4 – PBCS Integration


With the recent release of EPM suite 11.1.2.4, it's no big surprise that Data Relationship Management (DRM) has become an integration tool used with Planning and Budgeting Cloud Service (PBCS) as Oracle continues to push cloud computing. Hierarchies, nodes and properties can now be shared between DRM and PBCS by using the web interface or the EPM Automate utility, which enables users to perform import/export tasks remotely.

Some of my clients who have deployed their planning system on PBCS use the EPM Automate utility to automate common administrative tasks. Focusing on master data management, I am excited to see this new release of DRM enable both on-premises and cloud-based applications to share DRM dimensionality.

To perform DRM-to-Cloud integration in an automated fashion, Planning jobs must be established before using EPM Automate utility commands; this is a one-time activity performed by administrators. Jobs are actions such as importing/exporting metadata, importing/exporting data, launching business rules, and refreshing the database. Jobs can be started immediately or scheduled to run periodically.

The following example shows the steps required to import metadata from DRM to PBCS. It assumes that a file labeled Account.zip has already been created and is in the required format.

NOTE: The Account.zip file must be located in the same directory as the upload command.

When populating DRM from a PBCS application, hierarchies, nodes and properties are imported/exported via a .CSV format. The Upload command uploads the file to the inbox/outbox before the job is executed.

Blog1

NOTE:

  • Job names are case-sensitive.
  • Double quotes are required in the file name if the file name contains spaces.
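Before the upload, the Account.zip file itself has to be assembled. Here is a minimal sketch using only the Python standard library; the CSV column layout is an illustrative assumption, not the documented PBCS import format:

```python
import csv
import io
import zipfile

# Sketch of preparing a metadata file for upload. The columns below are
# illustrative only -- the real layout is dictated by the PBCS import job.
rows = [
    {"Parent": "Account", "Member": "Net_Income", "DataStorage": "StoreData"},
    {"Parent": "Net_Income", "Member": "Revenue", "DataStorage": "StoreData"},
]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Parent", "Member", "DataStorage"])
writer.writeheader()
writer.writerows(rows)

# Zip the CSV so it matches the Account.zip referenced by the import job.
with zipfile.ZipFile("Account.zip", "w") as zf:
    zf.writestr("Account.csv", buf.getvalue())
```

Once the archive exists in the same directory as the upload command, the EPM Automate upload and import-metadata jobs can pick it up as described above.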

The PBCS simplified interface provides an easy, web-based way to perform DRM-to-Cloud tasks and to create administrative jobs.

Metadata importing/exporting jobs can also be created using the centralized administration console as indicated in the picture below.

blog1-2

Jobs can be run immediately, or scheduled periodically, from the administration console. For more details, please head over to Using PBCS Simplified Interface.

Blog1-3

In my next blog, I will discuss the enhancements to the Data Relationship Governance (DRG) module in the 11.1.2.4 release. Thank you!

Migrating On-Premises Planning Application to PBCS


Are you currently thinking about migrating your existing on-premises Hyperion Planning application to the Cloud? Are you having reservations about what this process may entail? Well, put your reservations aside! Oracle Planning and Budgeting Cloud Service (PBCS) makes the migration easy, as it essentially shares the same code base as on-premises Hyperion Planning release 11.1.2.3. In this blog I will give a brief introduction to the basic steps of using PBCS Application Management (LCM) to migrate an on-premises Planning application to the Oracle Planning and Budgeting Cloud Service.

Prerequisites:

  1. The first release of PBCS supports an upgrade path from R 11.1.2.1 directly into a Cloud based application. The immediate roadmap includes adding support for upgrades from R 11.1.2.2 and then R 11.1.2.3 to PBCS.
  2. Ensure that the current on-premises Planning application is stable and without any cube refresh errors or invalid rules.
  3. The Service Administrator role is required in PBCS in order to perform the migration.
  4. The LCM Administrator role and Administrator role are required in the on-premises EPM system.
  5. The target application name in PBCS should be the same as the on-premises application name.
  6. Ensure that no application currently exists in PBCS.

Artifacts Not Supported:

  1. Shared Service custom roles
  2. Reporting and Analysis Annotation and Batch Jobs
  3. Essbase global substitution variables. Global substitution variables have to be converted into application-specific variables before migration
  4. Workspace Pages and Personal Pages
  5. Essbase report scripts and rules (.rul) files

 

Migration Steps:

  • Migrate the Security Model:
    • Identify on-premises EPM System users and groups. Generate a provisioning report that contains the information of users and groups provisioned for the on-premises Planning application that is being migrated.
    • Use the provisioning report to identify users who should be allowed access to the service. Use a text editor to create a comma-separated user upload file, users.csv, in the required format.
    • Log into the on-premises Planning application Shared Services as the Administrator and export the Native Directory as groups.csv.
    • Use a text editor to open the groups.csv file and delete information for groups that are not used to control access to Foundation Services, Planning, Enterprise Resource Planning Integrator, and Reporting and Analysis artifacts.
    • Add information pertaining to the external groups that are used to grant access to Oracle Hyperion Foundation Services, Planning, Enterprise Resource Planning Integrator, and Reporting and Analysis artifacts.
    • Use a text editor to create comma-separated files for those users being assigned to each PBCS role: viewers.csv, planners.csv, powerusers.csv, admins.csv.
    • Log into Oracle Cloud My Services and import the following files: users.csv, viewers.csv, planners.csv, powerusers.csv and admins.csv.
    • Log into Oracle Planning and Budgeting Cloud Service Application Management to import the groups.csv file.
  • Export Artifacts from the On-Premise Deployment
    • Launch the Shared Service Console in the on-premises deployment and export the following artifacts:
      • Foundation – Calculation Manager
      • Planning – Planning application except Global Artifacts – Report Mappings
      • Reporting and Analysis – Repository Objects:
        • All Financial Reporting objects associated with the Planning application (Snapshot Report and Snapshot Book do not need to be associated with an application)
        • Any third-party content
        • HRInternal – DataSources
        • UserPOVs for the users that were migrated as part of the security model migration in HRInternal – UserPOV
      • Security
    • Define the migration and specify the folders for the data set. Clear the Export with Job Output option and execute the migration. NOTE: The PBCS File Transfer Utility supports migrating artifacts between an on-premises environment and a PBCS environment without using the browser.
  • Export Data (Optional)
    • Export data from the EAS console as needed (R 11.1.2.1.x). Use administration services to perform the export.
  • Zip the Exported Artifacts and Upload the Zip Files to the Planning and Budgeting Cloud Service Workspace
    • Navigate to the middleware_home/user_projects/epmsystem1/import_export/admin@native directory on the Foundation Services machine in the on-premises deployment.
    • Right-click the export folder and, using your zip utility of choice, select Add to Archive.
    • In the Add to Archive dialog box, right-click the selected folders and set the following information:
      • Change the name of the archive to OnPremisesApplications.
      • In the Archive Format field, select Zip.
      • In the Parameters field, enter cu=on.
    • Log into PBCS and navigate to Administer, and then Application Management. Right-click Application Snapshots and select Upload. Browse to the folder containing the zip file and click Finish.
  • Import Artifacts to Oracle Planning and Budgeting Cloud Service:
    • In PBCS, navigate to Administer, then Application and expand Application Snapshots.
    • Import all products and artifacts in the following order: Reporting and Analysis, Planning, Calculation Manager.
    • If the Migration Status Report shows an error that the Exchange Rate artifacts failed to import, re-import Global Artifacts – Exchange Rate.
  • Manually Migrate Essbase Data to Oracle Planning and Budgeting Cloud Service (R 11.1.2.1.x)
  • Manually Define Business Rules Security if the Calculation Manager Module was used in release 11.1.2.1.x:
    • Start the Administration Service Console and expand Business Rules, then Repository View, and then Rules.
    • Use the following table to map the access privilege for each business rule.
    • Log into PBCS and navigate to Administration, then Manage, and then Business Rule Security.
    • Select each business rule that was migrated and manually assign its user/group privileges.
  • Check the Migration Status Reports and Validate the Application.

5 Tips on Administering Applications in Oracle PBCS


Have you ever wondered how much time you really spend on routine Planning admin tasks? Sometimes, a little trick can save you a lot of time with administrative chores. Here are my 5 favorite tips that you can use to simplify administrative tasks, and ensure that your PBCS application performs without a hitch.

#1 – Use the Console functionality on Simplified Interface

Console on Simplified Interface

  • From the Administration toolbar, metadata can be exported and imported, using the Export and Import functionality.
  • Refresh the database from the ‘Application’ menu.
  • By selecting “Admin Tools” from the landing page, applications can be removed and maintenance times scheduled.

Note: Administrators can execute these tasks on the standard interface; however, as I shared in my previous blog (Using PBCS Simplified Interface), the Console section integrates the most frequently used administrative tasks along with new functionality, significantly simplifying the administrator’s work flow as tasks can be performed from one centralized location.

Under Console, there are two tabs: ‘Application’ and ‘Jobs’:

Application: import/export metadata, refresh the database, remove applications, schedule maintenance times, and access the Planning inbox/outbox where files can be downloaded/uploaded. From the Application tab, the following functions are available:

  • Overview – quick application statistics in a graphical format
  • Plan Types – a simple wizard to create plan types
  • Dimensions – a single place to create, import, and export dimensions

Jobs: schedule jobs (i.e., refreshing a database and importing/exporting data/metadata) that launch at the scheduled time, and view recent job activity.

#2 – Run the Application Monitor to Improve Performance

Often, it’s puzzling why it takes so long to launch a business rule or refresh the database, when other jobs take only a few seconds. Is the connection to the cloud server down? Is it happening to one specific user or job?

To diagnose the performance problem, Application Monitor is a useful tool that helps administrators quickly pinpoint performance issues by artifact. Select the artifact you’d like to evaluate and click ‘Run Application Monitor’; a pie chart depicting the performance summary is displayed. The summary shows the performance status of each artifact with suggestions for improvement.

In the example below, after I ran the business rules diagnosis, the status report displays whether each business rule is in good, fair, or poor condition. For those business rules marked with a red light, you can click the ‘View Detail’ icons to view suggestions that can improve performance.

Application Monitor

 

Clicking “Run Diagnostics” under ‘Grid Diagnostics’ provides not only a diagnostics summary but also further detail about forms, such as load time and unsuppressed rows/columns.

Grid Diagnostics

After modifying the artifacts to optimize performance, the issue could very well be resolved. If not, you will have at least performed the preliminary steps before creating a service request with Oracle Support.

#3 – Batch Delete for Aliases, Formulas, UDAs, Smart Lists, and Other Attributes

Sometimes, you don’t mind spending a few minutes manually deleting some UDAs or formulas, changing metadata one item at a time. But what if you needed to delete more than a few UDAs or member formulas, say over 100 UDAs? You can either edit metadata one at a time in the dimensions, or import an updated .csv file. Taking the second approach, you might even choose to set the UDA column value to null; however, a null value does not replace existing values as you would expect. Use the value <none> instead. This tip is also valid for on-premise applications.

In the example below, all level 0 members are tagged with ‘UDA_1’. Now, suppose you don’t want this UDA attached to accounts 100, 102, and 110. You just need to replace UDA_1 with <none> for those rows in the .csv file and import the updated file. Voila! The UDA_1 tags for accounts 100, 102, and 110 disappear.

Delete UDAs
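The same edit can be scripted. The three-column layout below is a simplified, hypothetical export (real PBCS metadata files carry many more columns); the sed expression swaps UDA_1 for <none> on the three accounts only.

```shell
#!/bin/sh
# Hypothetical, simplified metadata export; real exports have more columns.
cat > accounts.csv <<'EOF'
Account,Parent,UDA
100,NetIncome,UDA_1
102,NetIncome,UDA_1
110,NetIncome,UDA_1
120,NetIncome,UDA_1
EOF

# Replace UDA_1 with <none> for accounts 100, 102 and 110 only.
# Importing accounts_updated.csv then clears the UDA on just those members.
sed -E 's/^(100|102|110),([^,]*),UDA_1$/\1,\2,<none>/' accounts.csv > accounts_updated.csv
```

Account 120 keeps its UDA_1 tag, since the pattern anchors on the three member names.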

#4 – Utilize Maintenance Snapshots to Restore Artifacts

To guard against catastrophic failure, Oracle backs up a snapshot of the service instance during the maintenance window every day.

The maintenance schedule is controlled using the dialog below:

Maintenance Time

If the database fails for some reason, you can use the artifact snapshots to restore the database’s contents to the previous day or to a previously known state.

Download Artifact Snapshot

To download application snapshots, right-click the Artifact Snapshot and download it to a local server.

 

*Note that it’s recommended to save the snapshots to a local server on a daily basis.

 

To restore application artifacts, simply right-click the selected artifacts and select import to return the database contents to their previous state.

Application snapshots

#5 – Use the EPM Automate Utility for Master Data Management

The fifth and final tip is really a no-brainer: automate master data loads! The EPM Automate Utility does just that. Administrators can set the process up once, and the utility takes care of the rest. The EPM Automate Utility allows you to automate many administrative routines, including the following.

  • Import/export metadata and application data
  • Refresh the application
  • Run business rules
  • Copy data from one database to another
  • Manage files in Oracle PBCS repository, such as upload, download, list and delete files
  • Export/import application and artifact snapshots using Application Management
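Pulled together, a nightly master-data load covering several of the routines above might look like the sketch below. File, rule, and job names, plus the URL and credentials, are placeholder assumptions, and epmautomate is stubbed as a shell function so the script can be read and traced end to end; remove the stub to run against a real instance.

```shell
#!/bin/sh
# Sketch: all names below are placeholder assumptions for illustration.
epmautomate() { echo "epmautomate $*"; }   # stub; remove for a real run

epmautomate login serviceadmin MyPassword https://planning-test.example.pbcs.com exampledomain
epmautomate uploadfile Accounts.csv         # latest master data extract
epmautomate importmetadata importAccounts   # job defined once in the Jobs console
epmautomate refreshcube                     # push metadata changes to the cube
epmautomate runbusinessrule AggAll          # re-aggregate after the load
epmautomate logout
```

Scheduled through cron or the Windows Task Scheduler, a script like this turns the routine load into a hands-off overnight job.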

I hope these five tips will be of some value to administrators looking to simplify administrative tasks, save time, and help manage applications more efficiently.

Consolidating on-premise and public cloud data on Oracle BI Apps


With an increase in the adoption of cloud applications by large corporations, most organizations today are in some form of hybrid state, i.e., they are using a combination of both on-premise and cloud applications to run their business.

Regardless of where the applications maintain their data, organizations need the ability to see a complete view of the company spanning across different parts of the business, which in this case would be combining insightful data across on-premise as well as public cloud instances.


To take some examples:

  • Combining HR and Financials data to analyze Revenue per employee
  • Combining Sales and Financials data to create customer Scorecards
  • Combining Sales and Order Management data to improve your demand planning system
  • Combining your Sales and Financials Data for Forecasting

 

In this article, I would like to present multiple design approaches that other organizations have successfully used to consolidate data from multiple cloud and on-premise applications and to perform seamless analytics across these varied data sources.

If you are attending Collaborate15 (#C15LV), please join me to discuss this topic and case studies of what other organizations are doing to address this challenge.

What is Cross Functional analysis?

Cross functional analysis is the ability to gather valuable business insights by efficiently joining both CRM and ERP data that are used to run front-office and back-office applications for an organization.

Shown in Figure 1 below is an example of cross functional analysis used to build a “Quarterly Revenue Forecast Report” for a services company that has three business units and runs Oracle Cloud to manage its CRM and Oracle EBS to manage its financial processes.


Figure 1

Data Consolidation Options

There are multiple options you could consider to support cross functional analysis by pulling information from a variety of source systems. Here, I would like to discuss the following two options:

  1. Consolidating all your insightful data on Oracle Business Analytics warehouse and using on-premise OBIEE to perform analysis
  2. Consolidating your data on Oracle Cloud Database schema and using Oracle BI Cloud Service (BICS) to perform analysis

The top half of Figure 2 below covers the cloud: the top left shows the various cloud applications where data is generated and hosted, while the top right shows the current Oracle cloud analytics platform that can be used to perform reporting.

Likewise, the bottom left shows multiple on-premise applications that create data in an on-premise instance, and the bottom right shows OBI Applications being used to perform reporting.


Figure 2

Option 1 – Consolidating all your insightful data on Oracle Business Analytics warehouse and using on-premise OBIEE to perform analysis

If you’re using all on-premise applications to run your business, you can use your existing Oracle BI Applications ETL process to extend into non-Oracle data sources.

If you’re in a hybrid state, using some cloud applications to run certain parts of your business, you can use cloud ETL (Extract, Transform and Load) tools to move your data onto on-premise staging tables, followed by Oracle’s prebuilt adaptors to push the data into the corresponding star schemas.

You might want to choose Option 1, moving your public cloud data (Salesforce, NetSuite, Oracle Applications Cloud) into the Oracle Business Analytics Warehouse (OBAW), if you have already deployed Oracle BI Applications and are using its data model to consolidate non-Oracle data sources.

Data Movement Technology

There are a number of technologies you can use to move your public cloud data to an on-premise data warehouse staging table. Oracle has recently released several technologies and approaches that can be used for this purpose, and the technology for moving data between cloud and on-premise instances is constantly evolving.

As shown in Figure 3 below, in order to consolidate the data into an on-premise Oracle Business Analytics Warehouse, we would continue to use Oracle’s prebuilt adaptors against Oracle ERP sources, and use one of the newly released data movement technologies to move data from the cloud to a common staging table.


 Figure 3

Design Considerations 

  1. Use staging tables to consolidate data from multiple applications (on-premise/public cloud)
  2. Leverage the BI Apps data model
  3. Leverage common dimensions for analysis across applications
  4. Assign a unique data source number to each distinct application

Option 2 – Consolidating your data on Oracle Cloud Database schema and using Oracle BI Cloud Service (BICS) to perform analysis

If you are considering the cloud to build your enterprise analytics platform, there are a number of data movement technologies that can be used to consolidate your data on Oracle’s cloud-based database schema service.

The approach here would be similar to the one discussed in Option 1, the difference being that we consolidate the data on the Oracle cloud-based Database Schema Service.

 


 Figure 4

Cloud technology is the next critical evolutionary step in software. It’s a trend that has momentum to become the dominant platform of the future. If you are interested in learning more about why you should consider cloud to build your enterprise analytics platform, you can view my earlier post here  on modernizing your analytics platform on Oracle BI Cloud Service.

Infographic: U.S. Data Consumption Growth


According to a study from the University of California at San Diego, U.S. data consumption in 2015 has risen to 8.75 zettabytes annually, the equivalent of nine DVDs per person per day.

Source: University of California at San Diego


(The reference works for me, although I think some kids today look at DVDs the way I look at eight track tapes.)

In addition, this year Americans will spend 35.2 billion hours on Facebook. (The kids like Facebook, as long as it’s viewed on a mobile device.)

But consumers aren’t just eating up content, they’re creating it as well. And that online content is an overflowing reservoir of information that businesses must tap deeply and efficiently. Millennials alone are a goldmine of potential business intelligence, and have a combined purchasing power of $2.45 trillion worldwide, according to Social Media Today.

How does the successful firm manage to extract valuable, actionable information from this ever-growing data pool? The clear trend is towards cloud-based business intelligence services. A recent survey by Dresner shows that more than 50 percent of businesses either currently use or plan to use cloud-based BI, and that sales and marketing teams make the greatest use of BICS.

For a more in-depth look at managing both cloud and hybrid models of business intelligence, register for our upcoming webinar, Actionable Data: Mastering the Hybrid Analytics Mix.

IT Spending Trends for 2015: The Real Story is Revenue


 

“I want to be the IT guy who’s out of the IT business.” –CIO at an asset management company.

Earlier this week, I was reviewing Computerworld’s 2015 Tech Spending survey results, published a few months ago. Looking at the numbers again, it’s clear that the story isn’t about specific technologies as much as it is about the evolution of the IT department.

Top Tech Spending Trends for 2015 (Source: Computerworld)


Security spending is no surprise. (Would you want to be the person who has to explain, post-incident, why security wasn’t a priority? Me neither.) But the increases in cloud and analytics shouldn’t be surprising either. In-house IT teams have become focused more on revenues and strategic initiatives, and less on problems such as managing infrastructure, now handled more often by experts at outside organizations. Accordingly, 24% of respondents expected to decrease spending on hardware.

The real news, as written in Forbes: “In 2015 IT will take on an increasingly important role growing revenue, profitability, enhancing competitiveness and getting to know customers.” In the past, the primary goal of an IT budget was worker productivity. But here, the number one reason given for increasing spending on cloud computing and business analytics was “to generate new revenue streams and increase existing ones.”

Today’s business leaders will be working and competing with more people like David Dodds, the CIO quoted in Forbes and at the top of this post. Dodds expects to spend about half of this year’s IT budget on cloud-based services, and will let the vendors worry about hardware. “I don’t have to have servers, and [employees can] bring whatever computer they have or I’ll just buy them a Chromebook.”

For an in-depth look at the role of cloud in business analytics, register for our upcoming webinar, Actionable Data: Mastering the Hybrid Analytics Mix.

Digital Disruption: 1960 to 2015 and beyond


“Be the joyful digital disrupter, not the hesitant and befuddled digital disruptee.”
– Bob Evans, SVP Communications, Oracle

Source: Forbes


The new computer arrives.


I’m a big fan of the recently concluded series Mad Men. The main theme of the show was societal change: Recognizing what has happened, what is happening, and dealing with it in one’s professional and personal lives. While the most common topics involved the role of women in the workplace, office politics, and personal reinvention, technological change was present as well. In the very first episode, a new employee receives the latest state-of-the-art tool for the workplace: an IBM Selectric typewriter. In the final season, many employees feel threatened when their firm purchases an IBM System/360 computer to help place TV ads.

Earlier this year, Bob Evans published the quote (top of the page) as the very first point in his “Top Ten Strategic Issues for CIOs in 2015.” (More on other points in later posts.) His colleague, Oracle CIO Mark Sunday, spoke about the need for leadership in this area: “What big data, cloud, mobile, and social have enabled is the fundamental rethinking of business. Industries we never thought would be disrupted are being totally reinvented by leveraging the digital transformation. Because every industry is being disrupted there are great opportunities to not only reinvent the business you’re in but move into new market segments or enable inorganic growth into areas that previously weren’t available. The CIO needs to be out in the front of that.”

Even if the CIO is out in the front, though, there’s a need for leadership throughout the organization. Executives in finance, marketing, and other departments will need to understand that if they’re not making digital transformation a priority, some of their competitors are, affecting the management of everything from regulatory compliance to human capital.

TV shows will come and go, but not digital disruption. It’s the real thing.

OBIEE 11.1.1.9.0: What You Need to Know


Oracle BI EE 11g Release 1 (11.1.1.9)

As you might know, Oracle has released OBIEE 11.1.1.9.0. There are a host of new features that I know some of my clients have been looking for, including the ability to save calculated fields for reuse in other analyses, enhanced export functionality, and Hyperion Planning data source support. We are installing 11.1.1.9 at one of our clients right now and will let you know our impressions in the next couple of weeks. The reason we are moving to 11.1.1.9 is some important changes to the Certification Matrix.


Updated Operating System Support

  1. Windows 2012 R2: BI Server and BI Admin Client Support
  2. Windows 8.1 64-bit: BI Server (dev use only) and BI Admin Client Support
  3. 32-bit OS no longer supported

NOTE: Windows 2012 R2 and Windows 8.1 64-bit require JDK/JRE 1.7.0_80+.

Expanded Database support for the OBIEE RCU Repository

  1. MS SQL Server 2012 SP1 (Native Drivers recommended)

Expanded Microsoft Office Support

  1. Microsoft Office 2013 is now supported for the OBI Add-In

Newer versions of Browsers are now required for support

  1. Apple Safari 7
  2. Apple Safari 8
  3. Firefox 31+
  4. Google Chrome 42+
  5. MS Internet Explorer 11

For more information see links below

Implementing Oracle Sales Cloud – Key Considerations Part 1


Getting it right the first time: when choosing and rolling out a new solution, a few key steps can help you avoid roadblocks, waste, and frustration. To leverage the strengths of the new tool and increase user adoption, the first step is making the SaaS vs. Enterprise decision and choosing the right cloud products.

SaaS vs. Enterprise

Choosing a SaaS (Software as a Service) solution versus a packaged/enterprise offering was a bigger consideration in the past, and the decision is a foregone conclusion if you have already chosen to go with Oracle Sales Cloud. If you are still in the consideration phase, note that many software vendors have addressed the issues that once prevented companies from adopting a SaaS (cloud-based) solution.

 


There are several advantages to a SaaS (aka cloud) solution that have led companies to choose this type of software delivery model over the traditional enterprise software implementation.

  1. Low cost of ownership
  2. Reduced time to deployment via rapid prototyping
  3. Avoiding upfront capital expenditures (pay as you go)
  4. The SaaS vendor is responsible for upgrades, uptime and security
  5. Scalability – SaaS platforms are very scalable and handle the increased infrastructure needed to scale
  6. Access from anywhere – since the software is in the “cloud”, it is accessible from anywhere there is internet access
  7. Ease of integration – SaaS solutions integrate easily with other applications. Some integrations are pre-built; others are easy to develop with an array of APIs/web services.

However, there are still factors that warrant some consideration when evaluating SaaS vs. Enterprise.

The first is data security. Many SaaS vendors, including Oracle, have made several improvements to address this concern, which used to be a common deterrent to choosing a SaaS solution. If you belong to an industry with stringent data security rules or specific regulations/requirements on how certain data must be handled, it is worth evaluating Oracle’s data security mechanisms and the certifications required.

The other consideration is integrations. This is usually more of a factor when migrating an existing enterprise SFA solution to a SaaS solution. Even though integrations are much easier with SaaS solutions, there is a cost to implementing them, which will likely go up as the number of integrations grows. Thus, for an existing SFA application with a number of integrations already built in, the cost of conversion is a considerable factor. That said, the cost of maintaining these interfaces will be lower with a SaaS solution, which mitigates the cost of conversion to some extent.

Cloud as Part of a Complete Solution

Sometimes, solutions like Oracle Sales Cloud are part of a suite of Customer Relationship Management (CRM)/Customer Experience (CX) Management products. This matters because sales automation is often only one aspect of CRM/CX that you are looking to address. Other products that integrate with the sales automation solution may handle other aspects of CX better, so keep this in mind when choosing the right product set for you. It is also a key consideration when looking at developing customizations: always check whether the function you are looking to build custom is already provided by another product in the Oracle CX suite. There might be a separate product that is better suited to the functionality you desire, especially if significant customization would otherwise be needed.


Oracle Service Bus 12c JCA Transport Endpoint Error


When an OSB 11g project using the JCA transport is imported to JDeveloper 12c, you might see the following error.

oracle.tip.adapter.sa.impl.fw.ext.org.collaxa.thirdparty.apache.wsif.WSIFException: Please specify a Service. Choices are:

This is one of the known issues in the project migration. You need to manually edit certain files to resolve this error. Here are the steps to resolve this issue.

1. Edit the WSDL associated with Business Service
Open the project in JDeveloper 12c and go to the business service. You will find the name of the WSDL under General Configuration. In the following example, you can see the name of the WSDL is WritePubEasyProductUploadFile.wsdl.

JCA01

2. Change the namespace
Change the targetNamespace, the wpefile namespace, and the import namespace to http://xmlns.oracle.com/pcbpel/adapter.

JCA02

Open the abstract WSDL. You can find its name in the location attribute of the WSDL import tag; in the screenshot above, the abstract WSDL is WritePubEasyProductUploadFile_Abstract.wsdl. Open this abstract WSDL and change the targetNamespace and wpefile attributes in the WSDL definitions tag to http://xmlns.oracle.com/pcbpel/adapter. Remove the xmlns:jca="http://xmlns.oracle.com/pcbpel/wsdl/jca/" declaration.
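After these edits, the definitions element of both WSDLs should look roughly like the fragment below. This is a sketch based on the steps above; the name attribute and attribute order are illustrative.

```xml
<wsdl:definitions name="WritePubEasyProductUploadFile"
                  targetNamespace="http://xmlns.oracle.com/pcbpel/adapter"
                  xmlns:wpefile="http://xmlns.oracle.com/pcbpel/adapter"
                  xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/">
  <!-- the xmlns:jca="http://xmlns.oracle.com/pcbpel/wsdl/jca/" declaration has been removed -->
  ...
</wsdl:definitions>
```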

JCA03

3. Edit the Business Service
Open the business service file, in this example WritePubEasyProductUploadFile.bix, in a text editor and change the namespace to http://xmlns.oracle.com/pcbpel/adapter.

JCA04


4. Refresh the Application
Refresh the application, or close and reopen the project. If that does not work, click the business service and refresh it. The error associated with the JCA transport should now be resolved.

JCA05

Two-Minute Video: Customer Success with Oracle Sales Cloud

OBIEE 11.1.1.9 Unable to Login with MSAD / LDAP (virtualize=true)


I would like to follow up on John Whitaker’s post, OBIEE 11.1.1.9.0: What You Need to Know, by adding a note about an important patch that was released on June 3, 2015.

Using an external security provider with Oracle Business Intelligence is increasingly common. When configuring Microsoft Active Directory, LDAP, or another third-party authentication provider, it is common to leave the DefaultAuthenticator provider enabled. Even if this is not the desired approach for the production environment, it is often desirable in development and/or test environments. The foundation of this functionality is setting the “virtualize” parameter to “true” in the Identity Store options within Enterprise Manager.

Setting “virtualize=true” in Oracle BI 11.1.1.9 will prevent users from logging in. Further information is available on Oracle Support.

OBIEE 11g: Alert: Users Unable to Log in to OBIEE 11.1.1.9.0 if Using MSAD or Other Third-Party LDAP as the Identity Store and Virtualization is Set to true. (Doc ID 2016571.1)

Fortunately, a patch is already available.

Patch 20188679: AUTHENTICATION FAILS AGAINST 3RD PARTY LDAP WHEN VIRTUALIZE=TRUE SET

APPCRASH During Install of Oracle BI on Microsoft Windows 2012 R2


Installing any commercial software on a newly certified operating system often leads to a journey of discovery. In this case, Oracle Business Intelligence 11.1.1.9 installed on Microsoft Windows 2012 R2 raises an alarming error during the installation process.

APPCRASH of SAWSERVER.EXE during install of Oracle BI 11.1.1.9 on Microsoft Windows 2012 R2

The crash warning above occurred during an Enterprise Install on a freshly built Microsoft Windows 2012 R2 server. After some digging on Oracle Support, I ran across the following bug report.

Bug 20597365 : SOFTWARE ONLY INSTALLATION OF 97% OCCUR C0000135/SAWSERVER.EXE CRASH ON WIN2012

Fortunately, according to the following support document, the installation process was successful and the error can be safely ignored. The defect and related support document are specific to Oracle BI 11.1.1.7, but I expect they will be updated to include the latest version once Oracle Support catches up with the new release.

OBIEE 11g – APPCRASH at 97% of Software Only Installation on Windows Server 2012 64-Bit (Doc ID 1996077.1)

OBIEE 11.1.1.9 – BI Publisher Set Up Fails on Windows Server 2012


The journey of discovery continues. I can report a successful install of Oracle Business Intelligence 11.1.1.9 on Microsoft Windows 2012 R2, but I encountered one last bug and set of supplemental instructions worth sharing.

During the Configuration Progress step of the Enterprise Install, the “Setting Up BI Publisher” step never completed. From the screenshot below you can see that I let the installer run for over 90 minutes with no progress. This step normally takes less than a minute, so something was definitely wrong.

BI Publisher Install Hang

I opened the log file shown in the screenshot above, but it didn’t include any detail. Fortunately, the related .OUT file (same file name as the .LOG file, but with a different extension) held the details of the problem:

updateBIPConfigFiles: Problem invoking WLST - Traceback (innermost last):
 updateBIPConfigFiles: File "D:\Middleware\Oracle_BI1\bifoundation\install\updateBIPConfigFiles.py", line 15, in ?
 updateBIPConfigFiles: File "D:\Middleware\wlserver_10.3\common\wlst\modules\jython-modules.jar\Lib/javaos$py.class", line 334, in system
 updateBIPConfigFiles: File "D:\Middleware\wlserver_10.3\common\wlst\modules\jython-modules.jar\Lib/popen2.py", line 235, in system
 updateBIPConfigFiles: File "D:\Middleware\wlserver_10.3\common\wlst\modules\jython-modules.jar\Lib/popen2.py", line 72, in __init__
 updateBIPConfigFiles: File "D:\Middleware\wlserver_10.3\common\wlst\modules\jython-modules.jar\Lib/javashell.py", line 64, in execute
 updateBIPConfigFiles: OSError: (0, 'Failed to execute command ([\'sh\', \'-c\', \'java -classpath D:\\\\Middleware\\\\Oracle_BI1/clients/bipublisher/xdo-server.jar oracle.xdo.install.UpdateConfigFiles 9704 9703 9710 jdbc/mds/owsm D:\\\\Middleware\\\\user_projects\\\\domains\\\\bifoundation_domain\']): java.io.IOException: Cannot run program "sh": CreateProcess error=2, The system cannot find the file specified')
 updateBIPConfigFiles: 
 java.lang.Exception: WLST Script task failed with status 1

After some digging on Oracle Support I ran across the following Bug Report.

Bug 21187922 : OBIEE 11.1.1.9 INSTALL FAILED AT CONFIGURATION STEP : WINDOWS SERVER 2012

There is no direct solution to this problem, but the following support document points to a workaround. 

OBIEE 11g: Error: “UpdateBIPConfigFiles: Problem invoking WLST – Traceback (innermost last)” BI Publisher Set Up Fails on Windows Server 2012 when Running the Configuration Assistant (Doc ID 1580583.1)

The solution involves updating the javashell.py file within the jython-modules.jar file. To leverage this fix we have to interrupt the Oracle BI installation, modify the appropriate file, then continue the installation. The detailed process is below.

  1. Download and install Oracle WebLogic Server 11gR1 (10.3.6) Generic and Coherence [V29856-01]
    1. Step 2 requires that Oracle WebLogic Server be installed in advance
    2. Oracle WebLogic Server 10.3.6 is required for Oracle Business Intelligence 11.1.1.9
  2. Perform a Software Only Install of Oracle Business Intelligence 11g (11.1.1.9.0) for Microsoft Windows (64-bit) [V76016-01]
  3. Update the javashell.py file
    1. Open a Command Prompt
    2. Navigate to the [Middleware Home]\wlserver_10.3\common\wlst\modules directory
    3. Extract the javashell.py file with the following command: [JDK Home]\bin\jar xf jython-modules.jar Lib/javashell.py
    4. Use a text editor to modify the [Middleware Home]\wlserver_10.3\common\wlst\modules\Lib\javashell.py file
      1. Search for the first occurrence of _osTypeMap in the file
      2. Add 'Windows Server 2012' to the map for "nt" right after 'Windows 7' (see below for details)
    5. Import the updated javashell.py back into the archive with the following command: [JDK Home]\bin\jar uf jython-modules.jar Lib/javashell.py
  4. Run the Oracle Business Intelligence 11.1.1.9 Configuration Assistant to continue your install as normal

The section of text in javashell.py to modify in step 3.4.2 above will look like this when complete; the added entry is 'Windows Server 2012'.

 _osTypeMap = (
              ( "nt", ( 'nt', 'Windows NT', 'Windows NT 4.0', 'WindowsNT',
                        'Windows 2000', 'Windows 2003', 'Windows XP', 'Windows CE',
                        'Windows Vista', 'Windows Server 2008', 'Windows 7', 'Windows Server 2012' )),

 
