Office 365 goes into public beta

by Katie Gatto

 

(PhysOrg.com) — 
Microsoft has released its latest business product, Office 365, into public beta. The beta, which is limited in the number of participants, lets users try out the new software in exchange for feedback.

So, what is Office 365?
According to the company, it is a next-generation cloud productivity service for businesses of all sizes. What that means is that Microsoft has brought Microsoft Office, SharePoint Online, Exchange Online and Lync Online into a continuously updated cloud service. The public beta will allow users in 38 markets and 17 languages to try out the software for the first time. It seems the service will be geared mostly toward small businesses, since 70 percent of the people who signed up for the limited beta were small businesses.
Office 365 has its own app store, allowing users to add to its functionality as they see fit. This store, called The Marketplace, currently has more than 100 apps and 400 professional services available to users. New apps and services are expected to be added over time.
In conjunction with the public beta of the new service, Microsoft has announced a new contest. Named the Office 365: Ready for Work contest, it is taking place on Microsoft’s Office 365 Facebook page. All businesses have to do to enter is share their experiences with the software on the page. The winner, determined by a community vote, will receive free access to Office 365 for a year, $50,000 in advertising and business services, and a Microsoft executive for a day to work at the business or charity of the winner’s choice.
No word yet on when Office 365 will be out of beta and on sale.

Office 365 promo video

More information: http://www.microso … fice365.mspx
© 2010 PhysOrg.com


>Why Consumer Goods Companies Need Analytics to Compete?


Mark A. Smith, in his Information Management blog post titled “Consumer Goods Companies Need Analytics to Compete,” contends that given the cutthroat competitive landscape among consumer goods manufacturers, from food and beverage to electronics and automotive, capturing the minds and wallet share of customers by reaching out to them with the right product at the right price is no easy task.

Optimizing business efforts from manufacturing to product and throughout the supply chain to customers requires a comprehensive set of tasks that cannot be done without insights on what has happened in the past and where current activities are going. Those event-driven and demand-driven insights can come through analytics that assess the small but important details of pricing, trade promotions and processes in the supply chain to keep retailers satisfied with inventory levels.


Retailers themselves are learning how to use analytics, as I recently pointed out (see “Analytics in Retail: An Operational and Financial Mandate”); to keep up, manufacturers must be smarter in their sales teams and in the people who manage the sales and operational plans that commit the capital and resources of the organization. In addition, Finance is no longer just along for the ride as it looks to get more engaged in the effectiveness of spending and in balancing financial resources to meet the margin targets for the quarter. This can be accomplished through costing and profitability analytics that examine product and marketing investments across the business.
These days it is insufficient to apply analytics only to data representing historical activities; also needed is predictive analytics based on models and variables that can provide forward-looking projections of what will likely happen based on the planned activities. Organizations need to know how much and when to invest in marketing and product activities where pricing and promotion can generate only limited benefits in this busy marketplace of ever more competitors and changing economic conditions. All these demands are driving many consumer goods manufacturers to build new competencies in business analyst teams and seek tools and technology to apply analytics most effectively. Analysts then can cross lines of business to work more closely together than just focusing on their individual department’s functions.
It’s clear from all this activity that the consumer goods industry is in flux, forced to mature in its processes and utilize technology to its fullest. To optimize consumer brand recognition and profitability, companies must re-examine their current analytic processes and use data to determine if there are faster, better and, yes, cheaper methods to meet never-ending demands from a variety of business areas.
Of course, any organization may need to prioritize where it focuses improvements across the lines of business, but I advise you to consider that industry-specific solutions from technology providers are likely to solve only some of your needs. Getting away from the silos of spreadsheets and the inefficiency of electronic mail is a must; you want an analytic process that is much like your manufacturing process. Working collaboratively across business and IT is one big step in the right direction.

Mark also blogs at VentanaResearch.com/blog.

>Analytics on Analytics: Is Decision Management the Last Frontier in BI?


 

Boris Evelson, in his Information Management Blog / Forrester Muse post titled “Decision Management, Possibly the Last Frontier in BI,” leans on an excellent article on the subject by Tom Davenport and reports on Forrester Research’s observation of a trend for “thinking ahead” companies to venture into combining reporting and analytics with decision management along the following lines:

  • Automated (machine) vs. non automated (human) decisions, and
  • Decisions that involve structured (rules and workflows) and unstructured (collaboration) processes.

Unfortunately, the current best practices and technologies that address the four distinct but closely related requirements formed by crossing these two dimensions come from different vendors, technologies and experts.


According to Evelson, Tom Davenport pointed out a challenge.
Which is?

How does one convince a non-analytically oriented CEO that analytics and decision management are vital to enterprise success?

The gap, according to Davenport:
 
“There’s a big, big gap between the most analytical and the least analytical. American business has a fair number of CEOs with engineering backgrounds, and they tend to be relatively analytical. At the same time, an awful lot have sales backgrounds, and they’re not analytical at all. Clearly, you could do a lot of analytics with sales, but people don’t generally go into sales because they like numbers. Executives with legal backgrounds also don’t tend to be very quantitative in their decision approaches.”

Evelson explains that it is very hard (but possible) to build a business case around analytics and decision management, with a concrete, tangible ROI, using competitive BI benchmarks.

And he concludes by pointing out that analytics on analytics – understanding when, by whom, and how analytics are used in an enterprise, and potentially correlating that usage to decisions, good or bad – is also one of the emerging best practices.
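A minimal sketch of what “analytics on analytics” could look like in practice: aggregating a usage log and relating report usage to decision outcomes. The log schema (user, report, decision_outcome) and all values are hypothetical assumptions for illustration, not anything Evelson or Forrester prescribes.

```python
from collections import Counter

def analytics_on_analytics(usage_log):
    """Summarize who runs which reports and how often the decisions
    attached to those runs worked out. Schema is hypothetical."""
    runs_by_report = Counter(e["report"] for e in usage_log)
    outcomes = {}
    for e in usage_log:
        good, total = outcomes.get(e["report"], (0, 0))
        outcomes[e["report"]] = (good + (e["decision_outcome"] == "good"),
                                 total + 1)
    return {r: {"runs": runs_by_report[r], "good_decision_rate": g / t}
            for r, (g, t) in outcomes.items()}

# invented usage log
log = [
    {"user": "cfo", "report": "margin_forecast", "decision_outcome": "good"},
    {"user": "ops", "report": "margin_forecast", "decision_outcome": "bad"},
    {"user": "cfo", "report": "churn_model",     "decision_outcome": "good"},
]
summary = analytics_on_analytics(log)
```

Even this toy summary shows the idea: a report with many runs but a poor good-decision rate is a candidate for review.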
 
Boris also blogs at http://blogs.forrester.com/boris_evelson/.

>Six Steps to Governing Analytics



This approach deals directly with behavioral change and the actions to help ensure that value is being achieved

Information Management Magazine, July/Aug 2010

Predictive analytics takes the information made available through descriptive analytics (which is historically focused) and combines it with more sophisticated statistical modeling, forecasting and optimization techniques to derive insights that help anticipate the impact on business outcomes. Where are organizations when it comes to leveraging predictive analytics? Research shows that while some organizations may analyze data to predict what might happen in the future in terms of competitor activities, market trends, product/service development, risk management, financial/economic trends and skill requirements, many organizations are still using predictive analytics only to a minor extent, if at all. It’s clear that companies need to move from descriptive analytics (the “what”) to predictive analytics (the “now what?”), from “what happened?” to “what’s the best that can happen?”
During previous economic downturns, companies that thrived used data-derived insights made by informed decision-makers to produce lasting competitive advantage…
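The move from the “what” to the “now what” can be illustrated with a toy forecast: the descriptive view summarizes history, while even the simplest predictive view projects forward. The quarterly sales figures are invented, and a real predictive model would be far more sophisticated than this least-squares trend line.

```python
def descriptive(sales):
    """The 'what happened': summarize history with an average."""
    return sum(sales) / len(sales)

def predictive(sales):
    """The 'now what': project the next period with a least-squares
    trend line. A sketch only; real forecasting models do much more."""
    n = len(sales)
    xbar = (n - 1) / 2
    ybar = sum(sales) / n
    slope = (sum((i - xbar) * (y - ybar) for i, y in enumerate(sales))
             / sum((i - xbar) ** 2 for i in range(n)))
    return ybar + slope * (n - xbar)  # value of the line at the next index

quarters = [100.0, 110.0, 120.0, 130.0]  # invented quarterly sales
print(descriptive(quarters))  # 115.0
print(predictive(quarters))   # 140.0
```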

Today, some companies have this down to a science, while others are just starting to acquire a better understanding of how they can use predictive capabilities to increase their revenue stream and reduce their costs. For them, it often means implementing analytics across a number of business functions, such as supply chain management, customer acquisition and retention, talent and organizational performance, finance and performance management. These are the core domain areas that companies want to analyze and be more predictive in how they forecast information and optimize their existing capabilities.
Organizations considering analytics across the enterprise should follow a six-step approach to governing their analytics that will take them from the “what” through the “so what” and into the “now what.” This approach deals directly with behavioral change and the actions to help to ensure that value is being achieved.

  1. Identify the key targets and metrics. There has been considerable activity of late surrounding getting data right and making sure the right data is available to make decisions. Identifying key metrics for analytics, though, involves a different set of discussion points. The focus here is on identifying the strategic business issues that can benefit from predictive analytics: How should you be optimizing your supply chain? How should you be forecasting your financial numbers? How can you become more focused on customer acquisition and retention? And what are the metrics that say you’re actually doing a good job in those areas?
  2. Generate insights. This is where the software automation used helps generate a forecasting capability, an optimization engine, a predictive model or pattern recognition. This is the “aha” moment where you say, “Okay, I now recognize that 80 percent of my turnover problem is occurring in customers who’ve been with us less than three months.”
  3. Validate insights. Can I be confident the insight generated is true? Has the data been prepared properly? Have I taken a broad enough selection of data to be able to ensure that the pattern identified is reasonable and accurate? Validating insight is as important as generating it, because once you’ve validated it you can move on to creating programs that will improve business outcomes.
  4. Plan and execute decisions. This is where differentiation in the marketplace occurs. It’s where companies can pull ahead of their competitors, because they can now take a proactive step toward managing whatever the insight has told them to do. For example, logistic regression can identify the variables that contribute significantly to turnover among new customers. This in turn allows a company to monitor its customers and develop strategies to reach out to at-risk customers before they churn.
  5. Realize value. If you’ve identified customers that are likely to purchase from you a second time, for example, you can now target these individuals and make sure you’re actually achieving some lifts in your sales figures based on this targeted activity.
  6. Monitor performance over time. You want to be able to track over time how well you’re doing as compared to your original baselines. So not only do you have to achieve value, but you have to ensure that it’s occurring in an ongoing way. And being able to monitor – weekly, monthly, quarterly – becomes a huge part of ensuring the value of the analytics and how it’s changing the organization’s bottom line.
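Steps 2 and 4 above can be sketched in a few lines of Python. The customer records, the three-month tenure threshold and the field names are hypothetical; a production version would use a fitted model (such as the logistic regression mentioned in step 4) rather than a single threshold.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: int
    tenure_months: int  # months since first purchase
    churned: bool       # left during the analysis window

def churn_share_by_tenure(customers, threshold_months=3):
    """Step 2, generate the insight: what share of all churn comes
    from customers below the tenure threshold?"""
    churned = [c for c in customers if c.churned]
    if not churned:
        return 0.0
    early = [c for c in churned if c.tenure_months < threshold_months]
    return len(early) / len(churned)

def at_risk(customers, threshold_months=3):
    """Step 4, act on the insight: flag current customers in the
    high-risk tenure band so retention outreach can target them."""
    return [c.customer_id for c in customers
            if not c.churned and c.tenure_months < threshold_months]

# invented data: 4 of the 5 churners have tenure under 3 months
data = [
    Customer(1, 1, True), Customer(2, 2, True),
    Customer(3, 1, True), Customer(4, 2, True),
    Customer(5, 24, True),
    Customer(6, 1, False), Customer(7, 12, False),
]
print(churn_share_by_tenure(data))  # 0.8 -> the "80 percent" aha moment
print(at_risk(data))                # [6]
```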

This six-step approach provides organizations with a good structure for seeing the end-to-end picture. Most recently, it helped the Royal Shakespeare Company create a new segmentation model using a number of additional variables for the first time. This identified a “Golden Geese” segment that is more likely to buy expensive tickets, attend a Saturday night performance and book more than three months in advance. Less sporadic in attendance than the regulars, these Golden Geese have become an invaluable segment to which to market the RSC experience and a prime target for expanded packages.
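As a rough illustration, the “Golden Geese” criteria described above (expensive tickets, Saturday-night attendance, booking more than three months ahead) can be expressed as a simple filter. The record fields and the price threshold are assumptions for the sketch; the RSC’s actual model was built from many variables, not hard rules.

```python
def golden_geese(bookings, price_threshold=75.0):
    """Flag patrons matching the Golden Geese profile. Field names
    and the 75.0 price threshold are illustrative assumptions."""
    segment = set()
    for b in bookings:
        if (b["ticket_price"] >= price_threshold
                and b["day_of_week"] == "Saturday"
                and b["days_booked_ahead"] > 90):  # > ~3 months
            segment.add(b["patron_id"])
    return sorted(segment)

# invented booking records
bookings = [
    {"patron_id": "A", "ticket_price": 95.0, "day_of_week": "Saturday",
     "days_booked_ahead": 120},
    {"patron_id": "B", "ticket_price": 30.0, "day_of_week": "Friday",
     "days_booked_ahead": 10},
    {"patron_id": "C", "ticket_price": 80.0, "day_of_week": "Saturday",
     "days_booked_ahead": 200},
]
print(golden_geese(bookings))  # ['A', 'C']
```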

Greg Todd is Executive Director – Technology at Accenture Analytics. Todd has been a technical architect for over 17 years, with an increasing focus on the reporting and analysis of information, both structured and unstructured. Through leading many enterprise resource planning implementations, he has seen how critical the management of information is to the bottom line of any corporate strategy.

>Data Governance: What "Right" Looks Like


There is a proper way to implement data governance, and either it works or it needs improvement

Jane Griffin Information Management Magazine, July/Aug 2010

In school, you got an “A.” In golf, it’s a par. With your sales forecast, it’s a defined set of numbers or a big account won. In most things personal and in business, it’s pretty easy to tell when you get it right. With goals that are less familiar, you ask yourself, “How do I know if I’m doing this right? How do I know if I’m achieving my goals?” For companies trying to implement data governance programs, those kinds of questions get asked – a lot.

Data governance programs are as unique as the companies that implement them. However, the frameworks for data governance programs are actually pretty similar for each implementation. There are certain foundational components on which governance is built. I’m going to briefly describe each component and then describe how that component looks when it’s being properly managed. In other words, we’re going to describe what “right” looks like.

The six components of a data governance framework are:

  • Organization – administers activities and provides a responsible network of resources to deliver governance capabilities.
  • Policies, principles and standards – refers to information management guidelines and principles for enforcing data standards and governance procedures.
  • Processes and practices – establishes guiding principles for how policies and processes are created, modified and implemented.
  • Metrics – establishes measures for monitoring information/governance performance and actions to continually improve enterprise data quality.
  • Data architecture – includes enterprise data standards, a business information model, metadata dictionary, as well as security and privacy measures.
  • Tools and technology – supports common information exchange solutions, workflow and business rules, as well as user presentation.

There is a right way to implement each of these components. It’s not really up for debate – either it works, or it needs improvement. It’s pretty quantifiable.
The first component is organization. When data governance is implemented effectively, there is a chief data officer in charge of the effort to provide strategic leadership, executive direction and oversight to drive sustained alignment with business priorities and compliance with regulatory mandates. There are also clearly defined roles and responsibilities, including those for data ownership and stewardship. And there are data standards bodies in place that set overarching governance policies and serve as a forum for resolving conflicts.
For the component dealing with policies, principles and standards, we need to see that common data standards have been established and implemented across lines of business for key data subject areas such as customer, contracts, vendors, products, etc. The governance policies and procedures will look clearly defined, broadly communicated and understood throughout the enterprise. This helps reinforce data standards, particularly for the key data subject areas.
Data governance processes and practices will be right when key DG processes (such as those for change requests, compliance and exception reporting, issue resolution, etc.) are clearly defined, established and tested across the enterprise. Areas where business processes are impacted by data governance processes will have been identified and suitable actions will have been taken to address and mitigate those impacts.
When the metrics component is implemented effectively, the metrics will be measuring and providing information on what they’re supposed to measure. Metrics should provide consistent, periodic measurement standards to effectively monitor data quality, compliance with data governance policies and standards, and the overall performance of the governance organization.
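A minimal sketch of the kind of periodic data-quality measurement this component calls for. The record schema, the key field and the two metrics chosen (completeness of required fields and uniqueness of the key) are illustrative assumptions; real programs add validity, timeliness and other dimensions.

```python
def data_quality_metrics(records, required_fields):
    """Compute two simple data-quality metrics over a batch of records:
    completeness (share of required fields populated) and uniqueness
    (share of distinct values in the hypothetical customer_id key)."""
    total = len(records) * len(required_fields)
    populated = sum(1 for r in records for f in required_fields if r.get(f))
    ids = [r.get("customer_id") for r in records]
    return {
        "completeness": populated / total if total else 1.0,
        "uniqueness": len(set(ids)) / len(ids) if ids else 1.0,
    }

# invented records: one empty name, one null country, one duplicate key
records = [
    {"customer_id": 1, "name": "Acme", "country": "US"},
    {"customer_id": 2, "name": "",     "country": "DE"},
    {"customer_id": 2, "name": "Duo",  "country": None},
]
m = data_quality_metrics(records, ["name", "country"])
```

Tracking numbers like these period over period is what lets the governance organization show whether data quality is actually improving.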
When the data architecture component is executed properly, there is a practical, usable data model in place, and all data quality and metadata management requirements have been identified and implemented. The implemented architecture should be extensible, scalable and robust, while being flexible enough to meet the requirements of an ever-changing IT environment. Finally, it should support key line-of-business functions at the enterprise level.
The final component, tools and technology, is effectively implemented when key business, information management and governance processes are automated across LOBs and business functions as much as is practicably possible. The tools and technology should also provide full support for governance activities such as workflow and content management, as well as for DG standards and policies.
The potential benefits that arrive when we do data governance right are enormous, especially in the long term. Efficiency will be improved via access to higher quality data for decision-making. LOBs are better able to coordinate their activities due to standardized processes and access to enterprise-wide data. This can provide substantial cost savings. Savings will also be achieved by reducing the number of IT applications and systems and standardizing the ones that remain.
Capacity will be improved due to better reporting and analytical capabilities arising from improvements to data quality and access. Regulatory compliance can likewise be improved due to standardized processes and data needed to maintain compliance performance. The culture of data ownership and stewardship that an effective data governance program will engender can also help maintain compliance. With benefits and opportunities like these, doesn’t it make you want to really get it right?



Jane Griffin is a Deloitte Consulting LLP partner. Griffin has designed and built business intelligence solutions and data warehouses for clients in numerous industries. She may be reached via e-mail at janegriffin@deloitte.com.  

Deloitte is not, by means of this article, rendering business, financial, investment, or other professional advice or services. This article is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business.

>Data Governance: What "Right" Looks Like

>

There is a proper way to implement data governance, and either it works or it needs improvement

Jane Griffin Information Management Magazine, July/Aug 2010

In school, you got an “A.” In golf, it’s a par. With your sales forecast,it’s a defined set of numbers or a big account won. In most things personal and in business it’s pretty easy to tell when you get it right. With goals that are less familiar you ask yourself, “How do I know if I’m doing this right; how do I know if I’m achieving my goals?” For companies trying to implement data governance programs, those kind of questions get asked – a lot.

Data governance programs are as unique as the companies that implement them. However, the frameworks for data governance programs are actually pretty similar for each implementation. There are certain foundational components on which governance is built. I’m going to briefly describe each component and then describe how that component looks when it’s being properly managed. In other words, we’re going to describe what “right” looks like.

The six components of a data governance framework are:

  • Organization – administers activities and provides a responsible network of resources to deliver governance capabilities.
  • Policies, principles and standards – refers to information management guidelines and principles for enforcing data standards and governance procedures.
  • Processes and practices – establishes guiding principles for how policies and processes are created, modified and implemented.
  • Metrics – establishes measures for monitoring information/governance performance and actions to continually improve enterprise data quality.
  • Data architecture – includes enterprise data standards, a business information model, metadata dictionary, as well as security and privacy measures.
  • Tools and technology – supports common information exchange solutions, workflow and business rules, as well as user presentation.

There is a right way to implement each of these components. It’s not really up for debate – either it works, or it needs improvement. It’s pretty quantifiable.
The first component is organization. When data governance is implemented effectively, there is a chief data officer in charge of the effort, providing strategic leadership, executive direction and oversight to drive sustained alignment with business priorities and compliance with regulatory mandates. There are also clearly defined roles and responsibilities, including those for data ownership and stewardship. And there is a data standards body in place that sets overarching governance policies and serves as a forum for resolving conflicts.
For the component dealing with policies, principles and standards, we need to see that common data standards have been established and implemented across lines of business for key data subject areas such as customer, contracts, vendors, products, etc. The governance policies and procedures will look clearly defined, broadly communicated and understood throughout the enterprise. This helps reinforce data standards, particularly for the key data subject areas.
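To make the idea of a common data standard concrete, here is a minimal sketch of how one might be enforced in code for a "customer" subject area. The field names and validation rules are illustrative assumptions, not taken from the article:

```python
# Minimal sketch: enforcing a hypothetical common "customer" data standard
# across lines of business. Field names and rules are assumptions.
import re

CUSTOMER_STANDARD = {
    "customer_id": lambda v: bool(re.fullmatch(r"CUST-\d{6}", str(v))),
    "legal_name":  lambda v: isinstance(v, str) and len(v.strip()) > 0,
    "country":     lambda v: isinstance(v, str) and len(v) == 2,  # ISO alpha-2
}

def validate_record(record: dict) -> list:
    """Return the list of fields that violate the enterprise standard."""
    violations = []
    for field, rule in CUSTOMER_STANDARD.items():
        if field not in record or not rule(record[field]):
            violations.append(field)
    return violations

good = {"customer_id": "CUST-000123", "legal_name": "Acme Corp", "country": "US"}
bad = {"customer_id": "123", "legal_name": "", "country": "USA"}
print(validate_record(good))  # []
print(validate_record(bad))   # ['customer_id', 'legal_name', 'country']
```

A real program would publish such rules centrally so every LOB validates against the same definitions rather than its own local variants.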
Data governance processes and practices will be right when key DG processes (such as those for change requests, compliance and exception reporting, issue resolution, etc.) are clearly defined, established and tested across the enterprise. Areas where business processes are impacted by data governance processes will have been identified and suitable actions will have been taken to address and mitigate those impacts.
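A change-request process like the one described above can be thought of as a small state machine with defined, tested transitions. The state names below are assumptions chosen for illustration:

```python
# Hypothetical sketch of a DG change-request workflow: only the defined
# transitions are legal, so the process is testable across the enterprise.
ALLOWED = {
    "submitted": {"under_review", "rejected"},
    "under_review": {"approved", "rejected"},
    "approved": {"implemented"},
}

class ChangeRequest:
    def __init__(self, description: str):
        self.description = description
        self.state = "submitted"
        self.history = ["submitted"]

    def transition(self, new_state: str) -> None:
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

cr = ChangeRequest("Add 'tax_id' to the vendor data standard")
cr.transition("under_review")
cr.transition("approved")
cr.transition("implemented")
print(cr.history)  # ['submitted', 'under_review', 'approved', 'implemented']
```

Encoding the process this way makes exceptions explicit: an attempt to skip review raises an error instead of silently bypassing governance.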
When the metrics component is implemented effectively, the metrics will be measuring and providing information on what they’re supposed to measure. Metrics should provide consistent, periodic measurement standards to effectively monitor data quality, compliance with data governance policies and standards, and the overall performance of the governance organization.
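As a sketch of what such a metric might look like in practice, the following computes a simple completeness measure over sample records. The fields and sample data are assumptions for illustration:

```python
# Illustrative data quality metric: completeness (share of records where a
# field is present and non-empty). Field names and data are assumptions.
def completeness(records, field):
    """Return the fraction of records with a non-empty value for field."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

records = [
    {"customer_id": "CUST-000001", "email": "a@example.com"},
    {"customer_id": "CUST-000002", "email": ""},
    {"customer_id": "", "email": "c@example.com"},
    {"customer_id": "CUST-000004", "email": "d@example.com"},
]

print(f"customer_id completeness: {completeness(records, 'customer_id'):.0%}")
print(f"email completeness:       {completeness(records, 'email'):.0%}")
```

Run periodically and trended over time, even a metric this simple gives the governance organization a consistent yardstick for data quality.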
When the data architecture component is executed properly, there is a practical, usable data model in place, and all data quality and metadata management requirements have been identified and implemented. The implemented architecture should be extensible, scalable and robust, while being flexible enough to meet the requirements of an ever-changing IT environment. Finally, it should support key line-of-business functions at the enterprise level.
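A metadata dictionary entry of the kind mentioned above might minimally record a field's definition, ownership, and privacy flags. The attribute names here are hypothetical, chosen to illustrate the idea:

```python
# Minimal, hypothetical metadata dictionary entry tying a data element to
# its definition, owner, steward, and privacy flag. Names are assumptions.
from dataclasses import dataclass, field

@dataclass
class MetadataEntry:
    name: str
    data_type: str
    description: str
    owner: str                 # accountable data owner
    steward: str               # day-to-day data steward
    pii: bool = False          # security/privacy flag
    source_systems: list = field(default_factory=list)

DICTIONARY = {
    "customer_id": MetadataEntry(
        name="customer_id",
        data_type="string",
        description="Enterprise-wide unique customer identifier",
        owner="Sales Operations",
        steward="Customer MDM team",
        source_systems=["CRM", "Billing"],
    ),
}

entry = DICTIONARY["customer_id"]
print(entry.owner, entry.pii)
```

The point of such a dictionary is that ownership and stewardship are recorded alongside the data definition itself, not in a separate document that drifts out of date.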
The final component, tools and technology, is effectively implemented when key business, information management and governance processes are automated across LOBs and business functions as much as is practicably possible. The tools and technology should also provide full support for governance activities such as workflow and content management, as well as for DG standards and policies.
The potential benefits of getting data governance right are enormous, especially in the long term. Efficiency will be improved via access to higher quality data for decision-making. LOBs are better able to coordinate their activities due to standardized processes and access to enterprise-wide data. This can provide substantial cost savings. Savings will also be achieved by reducing the number of IT applications and systems and standardizing the ones that remain.
Capacity will be improved due to better reporting and analytical capabilities arising from improvements to data quality and access. Regulatory compliance can likewise be improved due to standardized processes and data needed to maintain compliance performance. The culture of data ownership and stewardship that an effective data governance program will engender can also help maintain compliance. With benefits and opportunities like these, doesn’t it make you want to really get it right?



Jane Griffin is a Deloitte Consulting LLP partner. Griffin has designed and built business intelligence solutions and data warehouses for clients in numerous industries. She may be reached via e-mail at janegriffin@deloitte.com.  

Deloitte is not, by means of this article, rendering business, financial, investment, or other professional advice or services. This article is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business.