
ZynBit Empowers Sales with Microsoft Power BI Embedded

ZynBit incorporated Power BI Embedded into their offering to drive value to their customers and increase monetization opportunities for sales and marketing teams.

Business Needs

ZynBit connects all of your sales and marketing Bits to help companies transform teams into what we call “Sales Nerds”. The next generation of sales pros are 1-part sales engineer, 1-part customer success and 1-part data analyst. Customer buying behavior is changing – consumers and buyers are heavily researching their vendors way before the first sales interaction ever happens, forcing sales teams to rethink how they operate.

In fact, a recent Corporate Executive Board study shared by the Harvard Business Review stated that “on average, nearly 60% of a typical purchasing decision—researching solutions, ranking options, setting requirements, benchmarking pricing, and so on—happens before even having a conversation with a supplier.”

Solution

ZynBit is hosted on the Microsoft Azure cloud, which offers performance, scalability, and security. Power BI Embedded was a natural fit, with its ability to help ZynBit customers visualize customer data. ZynBit evaluated several solutions, focusing on user experience and design, the ability to tell stories with data, and ease of integration with the ZynBit platform.

Power BI Embedded offers a flexible pricing model with no upfront investment, which gives developers time to test and validate their product ideas. The consumption-based model is also nicely positioned for scale, making costs easy to calculate and fold into ISV subscription plans.
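As a rough illustration of how a consumption model can be folded into subscription pricing, consider this sketch. The per-session rate, margin, and function names are invented for illustration and are not actual Power BI Embedded pricing:

```python
# Hypothetical sketch of folding a consumption-based embedding cost into an
# ISV subscription tier. The rate and margin below are invented examples,
# not actual Power BI Embedded pricing.

def monthly_embed_cost(render_sessions: int, rate_per_session: float) -> float:
    """Metered cost of embedded report sessions for one month."""
    return render_sessions * rate_per_session

def tier_price(render_sessions: int, rate_per_session: float, margin: float) -> float:
    """A subscription price covering the metered cost plus a target margin."""
    return round(monthly_embed_cost(render_sessions, rate_per_session) * (1 + margin), 2)
```

For example, a hypothetical tier sized for 10,000 sessions a month at $0.01 per session with a 30% margin would be priced at $130.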

Benefits

Prologika helped ZynBit transition their data model from Tableau to Power BI and integrate Power BI reports with Power BI Embedded.

  • The multi-tenant system reduces maintenance.
  • Row-level security ensures that users can access only the data they are authorized to see.
  • Power BI Desktop helps minimize data engineering and development cycles.
  • The out-of-the-box UX supports interactive and dynamic filters, which was a great value-add for customers.
  • The integration with ASP.NET allowed ZynBit to use the Microsoft server-side libraries to easily embed dashboards in its ASP.NET MVC pages, and hosting Power BI Embedded reports on Azure was extremely easy.
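The row-level security idea from the list above can be illustrated with a small sketch. In Power BI the filtering is enforced by roles defined on the dataset rather than by application code, so this Python fragment only models the principle; the tenants, users, and field names are made up:

```python
# Illustrative sketch of the row-level security principle: each table row
# carries a tenant_id, and a user sees only rows for tenants they own.
# (In Power BI this is enforced by dataset roles, not application code.)

ROWS = [
    {"tenant_id": "acme", "account": "Contoso", "pipeline": 50_000},
    {"tenant_id": "acme", "account": "Fabrikam", "pipeline": 20_000},
    {"tenant_id": "globex", "account": "Northwind", "pipeline": 75_000},
]

USER_TENANTS = {"alice@acme.com": {"acme"}, "bob@globex.com": {"globex"}}

def visible_rows(user, rows=ROWS):
    """Return only the rows whose tenant the user is authorized to see."""
    allowed = USER_TENANTS.get(user, set())
    return [r for r in rows if r["tenant_id"] in allowed]
```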

Customer satisfaction

ZynBit launched in 2015 and initially didn’t charge, because ZynBit wanted to gather as much feedback from potential customers as possible. Now more than 4,000 customers across the world use ZynBit to increase sales and deliver a better customer experience. Prospect and customer insights have always been at the core of the ZynBit mission. The front-end user experience is delivered through Outlook and Gmail, and the back-end platform is built on Microsoft Azure for security and scale. The customer was extremely pleased to learn that Microsoft would be launching Power BI Embedded for developers, and Power BI is now part of the ZynIQ user experience.


Fortune 50 Company Rolls Out Microsoft BI Across the Enterprise

Excel Power Pivot and Power BI caught on like wildfire within our organization. Microsoft BI empowers our users to travel the entire continuum from self-service BI, team BI, to enterprise BI on a single platform like never before.

Data Architect

This organization wanted to empower business users to analyze their data without reliance on IT. Prologika delivered advisory services to help them plan and roll out Microsoft organizational and self-service BI across the enterprise.

Business Needs

A large organization wanted to adopt Microsoft BI across the enterprise. Historically, it was difficult for IT to meet the growing analytical needs of thousands of business users. Instead, the organization wanted to empower users in all units to gain access to their data and share the BI artifacts they produce in a controlled and secure manner.

The organization wanted to standardize its organizational BI as well. After evaluating several vendors, they decided to adopt the Microsoft BI platform for both self-service and organizational BI. Prologika was awarded the project and entrusted to help the organization plan and adopt Microsoft BI across the enterprise.

Solution

The Prologika team, consisting of several Microsoft Most Valuable Professionals (MVPs) and industry experts, delivered a set of advisory documents for data architects and executive stakeholders. The first set of documents provided processes and best practices for effectively implementing and supporting self-service BI, including comparing tool capabilities, recommending the best Microsoft BI tool for a proposed use, designing and scaling a Microsoft SharePoint BI infrastructure environment, and applying managed self-service BI best practices.

The second set of documents detailed comprehensive data strategy recommendations, including comparing Microsoft BI data structures, change strategy, monitoring strategy, capacity planning, data quality, and master data management, with the goal of utilizing the organizational Microsoft BI platform to deliver a single version of the truth.

The third set of documents focused on security and included security best practices that cover the Microsoft SharePoint Enterprise environment, the underlying SQL Server Analysis Services data structures, and the deployment of self-service BI models to SharePoint.

Prologika also delivered staffing and training documents that detailed the staffing and training requirements for different BI user types, and conducted Power BI training classes to train business users on self-service BI with Excel and SharePoint.

Benefits

Under the guidance of Prologika, this organization democratized BI and opened it to everyone on a single platform: Microsoft BI. It enabled the self-service, team, and organizational BI continuum. Now a business user can use Excel to create a self-service BI model, deploy it to SharePoint, and share it with teammates in a controlled and secure environment. If the data model gains popularity, IT can convert it to an Analysis Services organizational model that is sanctioned and managed by IT.

Empower business users

Business users from all units gain valuable data insights with the tool they use every day – Microsoft Excel. A business user can connect to wherever the data resides, import it into an Excel data model, and create ad hoc pivot reports to perform descriptive analytics. Then, the user can deploy the data model to an enterprise SharePoint farm dedicated to BI. Other users can view and interact with the Excel web reports without needing Excel installed locally.

Business users no longer rely on IT to produce paginated reports. Now the role of IT is to facilitate and supervise self-service BI. The best practices we delivered helped IT to establish a scalable and trustworthy environment.

Implement powerful organizational BI solutions

Microsoft BI enabled IT to create powerful solutions for analyzing huge data volumes. For example, management wanted to analyze the data during the peak season in near real time. A SQL Server data mart was developed to store the operational data within a given retention period. An Analysis Services cube was layered on top of the data mart to provide users with instant insights to detect abnormalities, such as longer delivery times, and react immediately to meet SLA contract terms.
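The retention rule described above can be sketched as a simple filter; the field name and window length are illustrative assumptions:

```python
from datetime import date, timedelta

# Sketch of the retention rule for the operational data mart: keep only rows
# whose event date falls inside a configurable retention window. The
# "event_date" field and the window length are illustrative.

def within_retention(rows, today, retention_days):
    """Filter rows to those inside the retention period."""
    cutoff = today - timedelta(days=retention_days)
    return [r for r in rows if r["event_date"] >= cutoff]
```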

Reduce licensing costs

Typically, large organizations tend to accumulate a variety of software tools, and this organization is no exception. They still use best-of-breed tools for specific needs, such as plotting point-to-point routes on a map. However, they realized that using the tools they already have allows them to reduce licensing and training costs.

 


Insurer Improves Operations with Master Data Management

AXIS Capital, a large property and casualty insurer, was looking for ways to improve the data quality and management of important data entities. Prologika implemented an MDM solution powered by Microsoft SQL Server Master Data Services (MDS) that allows business users to conveniently manage master and reference data in Excel.

Business Needs

Proper data governance is a critical issue for AXIS. Management realized that maintaining a single source of truth (a master database) could help them understand their operations better, because the thorniest data problems were traced back to multiple legacy core systems. Historically, there hasn’t been an easy way to reconcile the differences between disparate source systems and to give the organization an environment in which it can view and manage a “golden copy” of enterprise data.

In addition, regulatory compliance has been an important concern as large U.S. companies now spend millions of dollars on regulatory compliance. AXIS was looking for ways to standardize data in order to ensure compliance.

Solution

Prologika implemented a solution for managing master data and reference data powered by Microsoft SQL Server Master Data Services (MDS). Prologika wrote ETL processes to extract data from different business segments (subsidiaries) and consolidate it in MDS. Corporate hierarchies were created to define logical data exploration paths.

AXIS formed a data governance group whose members, drawn from finance and IT, would serve as the data stewards. The group was tasked with figuring out how each business subsidiary’s data definitions would transform into standardized datasets that could be imported into the MDM system.

Now a data steward can use Microsoft Excel to manage important master and reference data inside the tool they use every day. Multiple subscribing systems integrate with MDS to import the master data. For example, the company’s data warehouse imports several dimensions from the corresponding MDS entities.
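Conceptually, consolidating records from disparate subsidiaries into a “golden copy” looks something like the sketch below. The “latest non-null value wins” survivorship rule is an illustrative choice, not necessarily the rule configured in MDS:

```python
# Sketch of the "golden copy" idea: records for the same business key arrive
# from several subsidiaries, and a survivorship rule picks the winning
# attributes. "Latest non-null value wins" is an illustrative assumption.

def golden_record(records):
    """Merge records sharing a business key; the latest non-null value wins."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value is not None:
                merged[field] = value
    return merged
```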

Benefits

Using the master data management solution developed by Prologika, AXIS improved the quality of its master and reference data. Historically, large organizations have spent millions of dollars on MDM solutions with questionable returns. Our cost-effective approach helped AXIS save on implementation costs and delivered the MDM solution within a few months.

Improving operations

Effective master data management saves the company time and money. The result is much more than just clean data. MDM offers AXIS a “single version of the truth” gathered from disparate databases.

Increasing insight

Organizations that maintain a single source of truth (master data) understand their operations better than those that don’t. Clean, accurate data improves the company’s ability to track, measure, and interpret organizational performance, so it can make better business decisions.

Ensuring compliance

By standardizing and reconciling the main entities, master data management helped the company ensure compliance with regulations such as the Sarbanes-Oxley Act.

 

 


Insurance Company Improves Data Quality

Berkley Specialty Underwriting Managers (BSUM) wanted to improve its data analytics processes by transitioning from self-service to organizational BI. Prologika implemented a classic organizational solution, consisting of a data warehouse, ETL, and a cube that allows users to gain insights with Excel ad hoc reports. The ETL processes extract source data, transform it, and load the data warehouse.

Business Needs

BSUM understood that data analytics can help management better understand the company business and be more competitive in the crowded insurance space. Previously, the company attempted to use self-service BI tools but found that the complexity of analytical requirements surpassed the skillset of business users and capabilities of self-service BI tools. Management realized that an organizational BI solution is preferable because it can deliver a single version of truth. This required implementing ETL processes to transform the raw data into a star schema.

BSUM has multiple operational systems for creating policies and processing claims. Over the years, different versions of SQL Server Integration Services (SSIS) were used, ranging from DTS 2000 to SQL Server 2012. Management wanted to standardize all ETL packages on SQL Server 2012 to reduce developer and maintenance effort.

Solution

Prologika designed and implemented an organizational BI solution consisting of a data warehouse, cube, and ETL. Prologika used its home-grown ETL framework, which supports a configurable degree of parallelism. Depending on the target ETL server, BSUM can configure the framework to parallelize package execution over a certain number of threads. As a result, the ETL processing window was greatly reduced: a few hours for a full historical load and about 15 minutes for the daily load. The framework also supports incremental extraction for the daily load.
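The incremental extraction mentioned above is typically watermark-based; here is a minimal sketch of the idea, with illustrative field names rather than the actual BSUM schema:

```python
# Sketch of watermark-based incremental extraction for a daily load: only
# rows modified after the last recorded watermark are extracted, and the
# watermark then advances. The "modified" field name is illustrative.

def extract_incremental(rows, last_watermark):
    """Return rows changed since the watermark, plus the new watermark value."""
    changed = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in changed), default=last_watermark)
    return changed, new_watermark
```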

Prologika consultants migrated the BSUM legacy packages to SSIS 2012, with the goal of having all packages converted. Not only were the packages converted, but they also use the SSIS 2012 project deployment model, which greatly simplifies ETL development and monitoring.

Benefits

ETL takes 60-80% of the effort of implementing an organizational BI solution. Therefore, it’s important to follow best practices so that you don’t end up with an intractable mess of ETL code and don’t exceed your ETL processing windows. For more information about ETL best practices, read our “Is ETL (E)ating (T)hou (L)ive?” newsletter.

Reduced ETL window

Data volumes are growing rapidly while ETL processing windows are shrinking; you must do more with less. ETL often becomes a performance bottleneck that stands in the way of current and future BI initiatives. One way to reduce ETL windows is to run packages in parallel.

Many ETL tasks, such as ODS loads and loading dimensions and independent fact tables, can benefit greatly from parallel execution. The Prologika framework supports a configurable degree of parallelism and distributes the packages across parallel flows.
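The parallel distribution can be sketched with a thread pool whose size is the configured degree of parallelism; `run_package` below is a stand-in for invoking an actual SSIS package:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of a configurable degree of parallelism: independent ETL packages
# are distributed across a fixed pool of worker threads. run_package is a
# stand-in for executing a real SSIS package.

def run_package(name):
    return f"{name}: done"

def run_all(packages, degree_of_parallelism):
    """Execute packages concurrently on at most `degree_of_parallelism` threads."""
    with ThreadPoolExecutor(max_workers=degree_of_parallelism) as pool:
        return list(pool.map(run_package, packages))
```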

Reduced maintenance effort

Because all packages target SSIS 2012, BSUM standardized its ETL processes. Monitoring and troubleshooting ETL are simpler because the SSIS 2012 project deployment model supports task-level performance analysis in addition to easier development.

 


Records Management Firm Saves $1 Million, Gains Faster Data Access with Microsoft BI

With the SQL Server 2012 data warehouse, we won’t need disk arrays in each data center to handle all of that extra volume. This alone will save us $500,000.

Todd Pinniger, Senior Manager of Architecture and Product Development

Recall, a records-management firm, needed faster access to key performance indicators and more intuitive business intelligence (BI) tools. The company consolidated four data centers into a Microsoft SQL Server 2012 data warehouse. The solution’s performance enhancements speed employee access to more detailed data. By consolidating into a data warehouse, the company saved $1 million in hardware and licensing costs.

Business Needs

Recall provides records-management solutions to a worldwide customer base that includes more than 50,000 law firms, banks, governments, healthcare providers, financial firms, and other companies. The company stores paper, digital, and tape files securely or it destroys them as directed by customers.

To support agility and growth, Recall needs to be able to access current key performance indicator (KPI) information about its records inventory, customers, and operations. Previously, to obtain company-wide information, Recall had to synchronize data stored in 35 databases at four regional data centers, including several multi-terabyte databases. The synchronization process was so time-consuming and labor-intensive that company metrics could only be updated once a month. This delay made it difficult for the business to make decisions or answer shareholder information requests as quickly as it wanted to.

The company’s ability to analyze its data was also limited. To perform custom analysis, IT employees had to run reports against all 35 databases worldwide and import the results into a spreadsheet tool, a process that could take weeks.

Solution

Recall considered BI solutions from Panorama, CORDA, QlikView, Tableau, and SAP. The company ultimately decided to base its new solution on Microsoft SQL Server 2012 Enterprise data management software running on the Windows Server 2008 R2 Enterprise operating system. “We were already using Microsoft technology for many critical company systems, so we were very familiar with the environment and confident about its reliability,” says Todd Pinniger, Senior Manager of Architecture and Product Development at Recall.

With help from Microsoft partners Prologika and Berg Information Technology, Recall started deployment in August 2011 and went into production in February 2012. Recall also has 24-hour access to Microsoft Services Premier Support for help resolving problems. The solution includes a cluster with two active nodes, one with SQL Server 2012 Analysis Services and the other running SQL Server 2012 Integration Services. Microsoft SharePoint Server 2010 and SQL Server 2012 Reporting Services run on a second, load-balanced cluster. The software is installed on Dell PowerEdge R810 server computers attached to an EMC Symmetrix storage area network.

The new data warehouse holds about 7 terabytes of data in three databases of about 3 billion rows apiece. The company is using the Microsoft SQL Server 2012 xVelocity memory-optimized columnstore index feature, which speeds the processing and querying of very large data sets by storing index data in columns, not rows.
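Why a columnstore speeds such queries can be illustrated conceptually (this is not the xVelocity implementation): pivoting a table into contiguous columns lets an aggregate read one column and skip every other field:

```python
# Conceptual illustration (not the xVelocity internals) of why columnar
# storage speeds aggregation: summing one column touches only that column's
# values, while a row layout touches every field of every row.

ROW_STORE = [
    {"order_id": 1, "region": "EU", "amount": 10},
    {"order_id": 2, "region": "US", "amount": 25},
    {"order_id": 3, "region": "EU", "amount": 5},
]

# The same table pivoted into contiguous columns
COLUMN_STORE = {
    "order_id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [10, 25, 5],
}

def total_row_layout():
    """Row layout: every row (all fields) is touched to sum one column."""
    return sum(r["amount"] for r in ROW_STORE)

def total_column_layout():
    """Column layout: only the 'amount' list is touched."""
    return sum(COLUMN_STORE["amount"])
```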

“Before, certain queries were taking longer than we wanted,” says Carlos Rodrigues, a Business Intelligence Architect at Recall. “After we started using the xVelocity columnstore index, those queries dropped from about 20 minutes to less than 2 minutes.”

Recall has access to additional performance enhancements in SQL Server 2012, such as compression capabilities that can cut data volumes by 50 percent and the SQL Server 2012 Resource Governor, which helps administrators achieve more predictable performance.

For self-service BI capabilities, Recall is using Microsoft SQL Server 2012 Power View, an interactive data exploration and visualization feature in SQL Server Reporting Services that operates in SharePoint Server 2010. Recall is also taking advantage of PerformancePoint Services in Microsoft SharePoint Server 2010, which provides simple tools for building reports, scorecards, and dashboards for monitoring KPIs. The tools are used regularly by approximately 400 analysts and managers.

Benefits

By deploying a SQL Server 2012 solution, Recall has cut costs by U.S.$1 million. Decision makers now have faster access to critical company information, and the IT department has less manual work to do.

Saves $1 Million in Hardware and Licensing Costs

Because the new solution eliminates the need to replicate huge volumes of data in each of the company’s four regional data centers, Recall will no longer need to purchase and maintain as much hardware. “With the SQL Server 2012 data warehouse, we won’t need disk arrays in each data center to handle all of that extra volume,” says Pinniger. “This alone will save us $500,000.”

By consolidating its data warehouses, Recall will also reduce its number of SQL Server licenses. “With a global data warehouse based on SQL Server 2012, we will be able to go from 12 to 4 licenses, saving another $500,000,” says Rodrigues.

Speeds Access to Critical Information

The new solution provides much faster access to KPIs and other reports. “With our previous solution, certain metrics were only updated every month,” says Rodrigues. “With the SQL Server 2012 data warehouse, decision makers no longer have to wait until the end of the month and can instead access updated information on a daily basis. That keeps the business agile and helps us respond more quickly to shareholder requests.”

Cuts Manual IT Workloads

With all of the company’s data in one central repository, performing custom analysis no longer requires intensive effort by IT employees. “Before, it could take as long as two weeks to manually pull data from all 35 local databases, import it into another database, and then run queries against it,” says Pinniger. “And if, after that process, the business wanted to slice that data slightly differently, we would have had to start over. Now, with the SQL Server 2012 data warehouse and xVelocity, that two-week process has been reduced to about an hour’s work, which the business users can perform themselves.”

 


A Data Warehouse Design for Leading Retail Company

If you are a small, medium or large size corporation looking to set up any scale of business intelligence, I would personally vouch for Teo Lachev and Prologika as the place to go. There is no other place to find a more comprehensive and well-rounded BI consultant. From developing complex dynamic data-driven security, to performing server optimizations, to leading data warehouse design and many other things, Teo has an encyclopedic and applied knowledge of the complete Business Intelligence suite and the best practices to go with it!

Burzin Daruwalla, BI Developer

Teo has outstanding talent in data warehousing, business intelligence, and information systems, and a keen understanding of business value.

Guido Arroyo, System Analyst

inComm, a leading marketer and distributor of stored-value gift and prepaid products, needed an accurate and timely reporting system for internal users and external partners. Prologika delivered a data warehouse, cube, and framework to enable both internal reporting for data analysts and external reporting to inComm customers.

Business Needs

Realizing the value of data analytics, inComm envisioned a BI system but was unsure how to start. They sought external consulting help but were disappointed with the results. Their internal IT department lacked the skills to implement organizational BI solutions. Historically, the entire IT department was focused on producing paginated Reporting Services reports, a time-consuming process that led to maintenance issues because business metrics were redefined in each report.

Besides internal reporting, inComm wanted to provide reporting capabilities to their external partners. They envisioned an external-facing self-managed portal where an external partner would provision the users who needed access to reports. Then, the portal would authorize the user to see only their data.

Solution

As a first step, Prologika evaluated the existing data warehouse design and extended it to support complex business requirements. Following industry best practices, Prologika implemented a star schema design, along with the supporting data structures to authenticate users and enforce data-level security.

Prologika implemented an enterprise Multidimensional cube on top of the data warehouse, which held 1.5 TB of data. The cube provided excellent performance and promoted a single version of the truth. Prologika then architected Active Directory and Kerberos-based security for external reporting.

Prologika mentored the inComm staff to get them up to speed with data warehousing and cube development.

“Another important part of Teo’s skillset that goes unnoticed are his mentoring skills. Once Teo’s contract was over we did not feel lost or overwhelmed, in fact our entire team was empowered by him with all the knowledge he imparted onto us throughout his tenure at inComm.”

Benefits

With the help of Prologika, the customer went from zero to modern BI in the course of a few months. The BI system that Prologika implemented delivered enormous value to the client.

Internal insights

With a few clicks, business analysts can create interactive reports to analyze their operations from many different angles. The cube centralizes business logic and metrics in the semantic model so that they are not defined and redefined in the database or in reports. The cube includes over 100 metrics to support comprehensive descriptive analytics, such as averages, variances, growth, ratios, and many more.

From an end-user perspective, data analysis is a matter of dragging and dropping attributes on the report to slice and dice data. Reporting tools, such as Excel, auto-generate report queries, and the server takes care of aggregating data, such as summarizing sales at the year level. With a few clicks, users can gain insights across multiple subject areas, such as claims, enrollment, and customer care.
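The year-level rollup mentioned above can be sketched as follows; the sales rows are illustrative:

```python
from collections import defaultdict
from datetime import date

# Sketch of the year-level rollup the server performs automatically when a
# user drops a Year attribute on a report. The fact rows are illustrative.

def sales_by_year(facts):
    """Summarize sale amounts at the year grain."""
    totals = defaultdict(float)
    for f in facts:
        totals[f["date"].year] += f["amount"]
    return dict(totals)
```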

Customer satisfaction

External partners now have accurate reporting without relying on inComm to provide them with data. They have access to analytical and transactional reports that help them better understand their business and audit revenue. For their convenience, inComm provides ready-to-run paginated reports, but partners can also create interactive reports by connecting directly to the cube.

Knowledge transfer

The inComm internal developers acquired the necessary skills to extend and maintain the solution.

 


Insurance Company Transforms Organizational Data Into Strategic Asset

InSight transforms the organization’s data into a key strategic business asset, empowering employees like never before. It currently includes over 300 performance measures that can be analyzed across various dimensions, enabling our business to calibrate and share insights with rich data storytelling. We are extremely excited for the actionable intelligence and foresight this new tool will bring to our organization!

Andrew Teeple, Director Supply Chain Analytics

eSecuritel, a Brightstar company, needed a BI solution to help them understand and act upon their business. Prologika implemented an innovative self-service BI solution, InSight, which not only has self-service BI capabilities but also provides the organization with a “single version of the truth”.

Business Needs

Historically, reporting and analytics at eSecuritel used to be an exhaustive process, impacted by constant changes in the transactional data that made it challenging to report measures consistently across the organization.

When eSecuritel approached Prologika, they gave us a wish list of metrics that they wanted to track across the enterprise. The wish list was compiled by their data analysts, but even management was skeptical that the “dream” could ever materialize. The complexity of the source data and the lack of in-house BI expertise fed the perception that actionable BI would remain unattainable.

Solution

InSight is powered by SQL Server Enterprise Edition. The system extracts the transactional data from the source systems and loads it into an Operational Data Store (ODS). The ODS keeps the raw data in its original form but tracks historical changes.
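One common way an ODS tracks historical changes is a Type 2 style history; the sketch below assumes that approach, and the field names are illustrative rather than the actual schema:

```python
# Sketch of one way an ODS can track historical changes (a Type 2 style
# history): when a changed row arrives, the open version is end-dated and a
# new version is appended. The field names are illustrative assumptions.

def apply_change(history, key, new_values, effective_date):
    """Close the open version for `key` (if any), then append the new version."""
    for version in history:
        if version["key"] == key and version["end_date"] is None:
            version["end_date"] = effective_date
    history.append({"key": key, **new_values,
                    "start_date": effective_date, "end_date": None})
    return history
```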


Another set of ETL processes loads the data into a data warehouse database. When doing so, ETL cleans the data and transforms it into a star schema format that is suitable for reporting. As typical with our organizational BI solutions, Prologika implemented a semantic data model powered by SQL Server Analysis Services. Read our “WHY SEMANTIC LAYER?” newsletter to learn more about the advantages of having a data model.

Benefits

The semantic data model brought several important benefits to business users.

  • Great performance – BISM is optimized to aggregate data very fast. Queries involving regular measures and simple calculations should be answered within seconds, even when aggregating millions of rows!
  • Implicit relationships – The semantic layer includes relationships between entities. As a result, end users don’t need to join tables explicitly because the relationships have already been established at design time. For example, a business user doesn’t need to know how to join the Date table to the Claims table or write SQL. You simply add the fields you need on the report, and then the model knows how to relate them!
  • Interactive data analysis – From an end-user perspective, data analysis is a matter of dragging and dropping attributes on the report to slice and dice data. Power BI auto-generates report queries, and the server takes care of aggregating data, such as to summarize sales at the year level.
  • Data security – BISM models apply data security based on the user’s identity, such as allowing a business analyst to access only the subject areas they are authorized to access.

Single version of the truth

InSight centralizes business logic and metrics in the semantic model so that they are not defined and redefined in the database or in reports. The model includes over 300 metrics to support comprehensive descriptive analytics, such as averages, variances, growth, ratios, and many more.

Interactive data analysis

From an end-user perspective, data analysis is a matter of dragging and dropping attributes on the report to slice and dice data. Reporting tools, such as Power BI and Excel, auto-generate report queries, and the server takes care of aggregating data, such as summarizing sales at the year level. With a few clicks, users can gain insights across multiple subject areas, such as claims, enrollment, and customer care.

The semantic layer includes relationships between entities. As a result, end users don’t need to join tables explicitly because the relationships have already been established at design time. For example, a business user doesn’t need to know how to join the Date table to the Claims table or write SQL. You simply add the fields you need on the report, and then the model knows how to relate them!
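The implicit-relationship behavior can be sketched as follows; the tables, keys, and relationship registry are illustrative assumptions, not the actual model:

```python
# Sketch of an "implicit relationship": the model records once, at design
# time, that Claims.date_key joins to Date.date_key, so a report can combine
# fields from both tables without the user writing a join. All names are
# illustrative.

DATE_DIM = {20240101: {"year": 2024, "month": "Jan"}}
CLAIMS = [{"date_key": 20240101, "amount": 120.0},
          {"date_key": 20240101, "amount": 80.0}]

RELATIONSHIPS = {("Claims", "Date"): "date_key"}  # defined once at design time

def report(fields):
    """Resolve requested fields across related tables automatically."""
    join_key = RELATIONSHIPS[("Claims", "Date")]
    rows = []
    for claim in CLAIMS:
        dim = DATE_DIM[claim[join_key]]  # join implied by the stored relationship
        rows.append({f: claim.get(f, dim.get(f)) for f in fields})
    return rows
```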

Instant self-service analytics

Data analysts no longer have to rely on IT to provide them with the data they need, and they don’t have to create disjointed Excel “spreadmarts”. They simply connect their favorite reporting tool to the data model and select the measures and dimensions they need on the report. Moreover, the data model is optimized to aggregate data very fast: queries involving regular measures and simple calculations should be answered within seconds, even when aggregating millions of rows!


Records Management Company Gets Real-time BI, Boosts Sales with Mobile Solution

Ultimately, the new Microsoft mobile BI solution leads to more revenue for Recall and gives us deeper customer insight, helping us stay ahead of our competitors.

Todd Pinniger, Senior Manager of Architecture and Product Development

Recall, a records-management firm, needed faster access to key performance indicators and more intuitive business intelligence (BI) tools. The company consolidated four data centers into a Microsoft SQL Server 2012 data warehouse. The solution’s performance enhancements speed employee access to more detailed data. By consolidating into a data warehouse, the company saved $1 million in hardware and licensing costs.

Business Needs

Recall stores, protects, and streamlines vital information for banks, law firms, government organizations, healthcare providers, financial institutions, and other customers.

More than 200 Recall salespeople and account managers travel frequently, meeting with current and potential customers all over the world. This mobile sales force needed to have access to updated BI data, such as current key performance indicators (KPIs) and customer billing information. “The salespeople need to know detailed customer information prior to a meeting,” says Carlos Rodrigues, Business Intelligence Architect at Recall. “With that kind of information at their disposal, it’s easier to be successful in terms of sales.”

However, viewing this information was challenging because Recall had a limited ability to analyze customer data. “We simply didn’t have BI capabilities at all, let alone mobile BI capabilities,” says Todd Pinniger, Senior Manager of Architecture and Product Development at Recall. “If salespeople wanted updated customer data, they had to send email attachments to their laptops or iPads before going on the road.”

Also, because most of those employees do not have experience creating analytical reports, they relied on IT to build custom reports on specific customers. “The process wasn’t efficient, and it didn’t provide the real-time information they needed prior to meeting with a customer,” states Pinniger. In late August 2011, Recall began searching for a new self-service mobile BI solution.

Solution

After considering new BI solutions from companies including Panorama, CORDA Technologies, QlikTech, Tableau, and SAP, Recall chose to implement a new solution based on Microsoft SQL Server 2012 Enterprise software. “We have used Microsoft technology for many years, and we were eager to deploy this new solution because of our great experiences with Microsoft so far,” says Pinniger.

Prologika delivered a new solution, which went live in February 2012 and includes Microsoft SQL Server 2012 Analysis Services and SQL Server 2012 Integration Services, in addition to Microsoft SharePoint Server 2010 and SQL Server 2012 Reporting Services. The new data warehouse contains about 7 terabytes of data. Recall is also using the SQL Server 2012 xVelocity memory-optimized columnstore index feature to speed the processing and querying of large data sets.

To facilitate self-service BI, Prologika implemented PerformancePoint Services in Microsoft SharePoint Server 2010, a feature that provides easy-to-use tools for report building, as well as scorecards and dashboards used for monitoring KPIs.

For mobile self-service BI, Prologika created a URL that links to the new SharePoint site, so several hundred traveling sales professionals now have instant access to detailed customer information from their laptops, tablets, and smartphones. Recall is also taking advantage of Microsoft SQL Server 2012 Power View, a new interactive data-exploration and data-visualization feature in SQL Server Reporting Services that works within SharePoint Server 2010.

Benefits

Using its new Microsoft BI solution, Recall salespeople and account managers have deeper insight into the company’s customers, which helps increase sales. Additionally, Recall has avoided having to develop its own costly mobile BI software.

Gives Mobile Sales Force Easy-to-Use BI Tools

With the new mobile BI solution, Recall salespeople and account managers have a simplified way to get fast, real-time access to critical KPIs and other customer information. “Using PerformancePoint Services in SharePoint Server 2010, our mobile sales teams have an easy self-service BI tool they can access from their mobile devices, no matter where they are,” says Pinniger. “And they don’t have to have special training in analysis and reporting, so they can create reports without IT intervention.”

Mobile users can now view customer details before or during customer meetings. “Previously, salespeople would spend a lot of time generating the latest BI data before a customer meeting,” says Pinniger. “They now have instant access to dashboards and KPI reports on their iPads, so they can view a customer’s current revenue data during a meeting.”

Improves Customer Relationships and Increases Sales

Because they have better visibility into customer information, Recall sales professionals have improved relationships with their key customers. “With a deeper understanding of our customers’ business and needs, our salespeople can give more knowledgeable, thorough input to customers during meetings, which helps foster tighter relationships,” says Pinniger.

It also leads to better sales opportunities. “Because they have better insight, our sales managers can show a potential customer how another branch of that customer’s business is using Recall solutions in another region or country,” Pinniger says. “The sales managers can then more easily sell the solution to the potential customer. Ultimately, the new Microsoft mobile BI solution leads to more revenue for Recall and gives us deeper customer insight, helping us stay ahead of our competitors.”

Avoids Development Costs

Recall also avoided development costs because it did not need to create a separate mobile solution. “We already had the SQL Server 2012 BI solution in place internally, and we use the exact same solution for our mobile users,” states Rodrigues. “We initially looked at purchasing additional software to integrate the mobile functionality, but we saved a lot of time and money by not having to duplicate our development efforts. This is a great solution for our mobile salespeople, and we are very excited about expanding it to other mobile users in the future.”

 


An Independent Software Vendor Embraces the Cloud for Community-driven Data Analytics

VIA Consulting, an Independent Software Vendor (ISV), implemented an online platform (SynergyScape) to help communities collaborate and learn together. Prologika helped VIA to scale the platform on Microsoft Azure, and to implement cloud-hosted data analytics.

Business Needs

For VIA, deploying their SynergyScape online platform to the cloud was a no-brainer. The alternative was to make a significant investment in creating and maintaining an on-premises infrastructure. VIA selected Microsoft Azure because of its breadth of features and flexible pricing. VIA distributed its platform across multiple Azure virtual machines.

However, as VIA quickly realized, outsourcing the infrastructure to Microsoft Azure was one thing, but scaling to thousands of users was quite another. It’s easy to spin up a new Azure virtual machine, but expert help is required to make sure the performance of the cloud-hosted infrastructure meets service level agreements. VIA engaged Prologika to assess their platform and to ensure that it can handle the expected OLTP and OLAP data volumes.

Solution

As a first step, Prologika assessed the existing SQL Server deployment and presented its findings and recommendations to the VIA management. Using Visual Studio Online, Prologika conducted load tests to determine the system capacity. Prologika made changes to the OLTP database to make it more scalable. By taking advantage of the innovative Windows Server storage pools, Prologika increased the system capacity by pooling multiple data disks and distributing the OLTP load across multiple data drives. To make the system highly available, as required for coverage by the Azure service level agreement, Prologika implemented database mirroring so that the primary database fails over to a mirrored server.

With help from Prologika, VIA implemented a data warehouse database and hosted it on another Azure virtual machine. Prologika optimized the data warehouse database for analytical loads. As a next step, Prologika implemented a hybrid architecture consisting of an Analysis Services Tabular model and Power BI to empower VIA’s users with interactive data insights.

Benefits

Prologika helped VIA to scale its OLTP and OLAP infrastructure to thousands of users.

Scaling OLTP databases

Because the SynergyScape platform is community driven, sudden capacity increases are common. For example, a celebrity might host a TV interview and encourage their followers to sign up for the SynergyScape platform. This might result in tens to hundreds of concurrent OLTP requests. Powered by SQL Server, VIA’s OLTP database can scale to accommodate such spikes.

Customer satisfaction

Data analytics is the most valued feature of the SynergyScape platform. Previously, VIA used another vendor’s tool for delivering standard reports. However, the reports had performance issues and were lacking in interactive features. By switching to Power BI, VIA delivers interactive and insightful reports to its customers.

With a few clicks, users can create interactive reports to analyze their community data and slice and dice it in different ways. The in-memory model centralizes business logic and metrics and promotes interactive analytics. Data-level model security ensures that users can access only their data.


Prologika Optimizes a Large 2.5 TB Cube

This organization had a large 2.5 TB cube to analyze website traffic. Prologika optimized the cube and reduced the report query execution times from minutes to seconds.

Business Needs

This organization has implemented a large multidimensional cube to analyze website traffic, search results, and ad campaigns. The cube plays a strategic role in the company: senior management bases its decisions on the results from the cube.

The cube had grown to 2.5 TB, with some 25 billion rows in the fact table. The customer was complaining about long-running queries that took minutes to execute, and had doubts whether Analysis Services could scale.

Solution

On a referral from the Microsoft Analysis Services team, Prologika assessed the cube architecture with a focus on performance and provided remediation steps to optimize the cube performance.

We discovered issues with the way the MDX queries were written. Specifically, the query WHERE clause used arbitrary-shaped sets, which caused the Analysis Services server to scan all partitions. The cube was partitioned by hour, and data was kept for 40 days, resulting in 960 partitions. Although a query might need far less data, the server would scan all 960 partitions, reading and processing an enormous amount of data. As it turns out, the WHERE clause is not optimized to project the filter onto partitions. The solution was to replace the WHERE clause with a SUBSELECT, which brought the query execution time from minutes down to seconds.
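The rewrite can be sketched as follows. The cube, measure, and hierarchy names are hypothetical, since the customer’s actual queries are not shown in this study:

```mdx
-- Before (sketch): an arbitrary-shaped set in the WHERE clause.
-- The filter is not projected onto partition slices, so the
-- server scans all 960 hourly partitions.
SELECT [Measures].[Page Views] ON COLUMNS
FROM [WebTraffic]
WHERE ({ ([Date].[Hour].&[2012010100], [Site].[Site].&[SiteA]),
         ([Date].[Hour].&[2012010101], [Site].[Site].&[SiteB]) })

-- After (sketch): a SUBSELECT defines the subcube up front,
-- so the storage engine reads only the matching partitions.
SELECT [Measures].[Page Views] ON COLUMNS
FROM (SELECT { [Date].[Hour].&[2012010100],
               [Date].[Hour].&[2012010101] } ON COLUMNS
      FROM [WebTraffic])
```

Unlike WHERE, the subselect describes the requested slice in a form the storage engine can compare against each partition’s slice, allowing it to skip partitions outside the requested hours.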

Benefits

Thanks to the findings and remediation steps from Prologika, the query execution times were greatly reduced and the company’s faith in Microsoft organizational BI was restored.