Articles

Why I started vizbp…..


Visualization Blueprints – vizbp – is the first marketplace built for reporting and data analysis. Jumpstart your reporting projects with ready-to-use, proven reporting templates/solutions from creators around the world.

https://www.vizbp.com

I am a hustler. All (most) consultants are in one way or another. You have to be to find consistent work.

But, my mind is always thinking of other ways to make money and provide something valuable to others. What can I do or build to make things easier for others?

In my career, I have implemented or been involved in so many different software initiatives where the application was lacking features or reporting was non-existent. In those scenarios, my first inclination is to look for any solutions available for download or for purchase. This is typically not a successful venture.

So, I have written code, built reporting, or enhanced these solutions myself. My first thoughts afterward are:

  1. How much time could I have saved if I was able to purchase something – even if it was to just get me started and I was able to enhance it? The cost to purchase would likely be much less than the hours put in to start from scratch.
  2. How valuable is that time to a business? The opportunity cost has to be considered. It is not just the developer time; it is the time until the business gets working functionality or data analysis. What is the cost of that delay in business processes?
  3. How can I provide these solutions to other people or businesses like me?

I have tried a few business ventures and have finally found a good platform to bring this idea to fruition. vizbp (short for visualization blueprints) was created as an eCommerce marketplace in the same vein as Etsy, Themeforest, or eBay. It is a digital marketplace where developers (and non-developers) can sell or provide their business intelligence, reporting, code, or applications to what I believe is a huge customer base.

Some code and solutions will be free. Other solutions will be for a cost but will again be much cheaper than if developed from scratch and will provide a head start to productivity or efficiency.

What I believe is the most beneficial feature of this business is the consolidation of these solutions into a single marketplace/site. There are plenty of sites out there where developers or businesses are selling their individual solutions. Because they are scattered across individual sites, they are harder to find. Bringing those solutions together and making them easy to find will improve their sales and make it much easier for the consumer.

From a consultant perspective, placing content or solutions on this site will in turn drive consulting business. If a consumer purchases or downloads free code from the marketplace that you have designed, who will they turn to for support? Who will they contact for enhancements?

After you build those enhancements, what can you do with the enhanced product? Sell it or provide it to others on vizbp!

Bottom line – this site will drive revenue – from passive sales of your already developed products to the consulting revenue to enhance or support the product(s).

If you are interested – or know someone else who may be interested – and have developed code or designed solutions for software that is widely used, go set up a profile. Go let others know about the site. Let your customers know about the site.

The site will only be as good as the content you provide!

You can visit the site to see the features available to you as a customer or as a vendor. You can also contact me with any questions you may have regarding the site:

sales@vizbp.com

https://www.vizbp.com


One Bite Pizza Power BI Analysis

One Bite, Everybody Knows The Rules.

It was time for a bit of a fun review.

I decided to create a more fun reporting solution this time. If you haven’t watched One Bite pizza reviews from Barstool Sports, you should. It is a funny critique of a venue’s pizza, primarily around New York but also wherever Dave travels. It also includes reviews from the community.

From their site:

Find the best slice of pizza using the One Bite app from Barstool Sports. Starting as a video series, Dave “El Presidente” Portnoy has created a cult following of people giving “one bite” pizza reviews by the common man, for the common man. Now you can find all of Dave’s reviews complete with videos, scores and location here. One Bite users can upload their own reviews at any of the 125,000+ pizza restaurants listed across the world.

https://onebite.app/

This Power BI report analyzes the reviews via their publicly accessible API.

https://api.onebite.app/review

https://api.onebite.app/venue

I started by using the new web scraper but the performance was horrible. I was able to loop through the pages and scrape the data but it was sloooooowwwww. It literally took 10+ hours to refresh the reviews.

There was no publicly displayed/documented API for their site. However, I read a Reddit thread ( https://www.reddit.com/r/PowerBI/comments/a6f861/power_query_web_connector_scraping_function_is/ ) and followed the advice of Data_cruncher to find the API utilized by the site. After the rebuild, it took < 10 minutes to pull the data.

The API does seem to be missing some historical data. I believe the API started in June 2019, when they began collecting fan reviews, while the site accesses older reviews directly (not via the API). So, I scraped the old reviews and stored them as static JSON so I would not have to refresh that connection when pulling in new data from the API.

I then took the users from the old reviews and added them to the users retrieved from the reviews API as they would not have been included otherwise. As well, I took the venues from the historic reviews and added them to the venues extracted from the reviews.
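
The de-duplication step above can be sketched as follows. This is a minimal Python illustration only (the actual report does this in Power Query M, and the record fields shown here are hypothetical):

```python
# Merge historic scraped reviews with data pulled from the API, then union
# the users/venues found in each, de-duplicated by id.

def merge_unique(api_items, historic_items, key="id"):
    """Union two record lists, keeping the API copy when ids collide."""
    seen = {item[key]: item for item in historic_items}
    seen.update({item[key]: item for item in api_items})  # API wins on conflict
    return list(seen.values())

api_users = [{"id": 1, "name": "dave"}, {"id": 2, "name": "fan_a"}]
historic_users = [{"id": 2, "name": "fan_a"}, {"id": 3, "name": "fan_b"}]

users = merge_unique(api_users, historic_users)
```

Keeping the API copy on a conflict means any fields corrected in newer data win over the old scraped snapshot.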

This API is one of those where you cannot just call the full dataset. It has an offset and a limit, so you have to create a list of numbers and build a function to be called for each starting number (offset), incrementing between numbers by the limit. In this scenario, calls were limited to 1,000 records. So, I:

  • built a parameter to hold the number of “loops” I would like to call (pReviewList)
  • built a blank query and added the function to create a list :
    • = {0..Number.FromText(pReviewList)}
  • converted that to a table, multiplied each record by 1000 (increments), converted to text, and called my function

There does seem to be an issue with the review API: it gives an error if you request more records than are available (i.e. your offset + limit > # of records available). This same method works fine for the venue API. Therefore, I used a try ... otherwise, but had to hard-code the otherwise branch with the number of records to request if the 1,000-record call failed – though I may revisit this and have it loop through record requests until successful.
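
Here is roughly what that paging-plus-fallback logic looks like, sketched in Python for illustration (the report implements it in Power Query M). Note one difference: this sketch computes the remainder from a known total, whereas the article hard-codes the fallback count. The fake fetch_page stands in for the review endpoint and mimics the quirk where requesting past the end raises an error.

```python
TOTAL_RECORDS = 2345          # pretend record count on the server
PAGE_SIZE = 1000              # the API's per-call limit

def fetch_page(offset, limit):
    # Stand-in for the API call; errors when offset + limit > records available
    if offset + limit > TOTAL_RECORDS:
        raise ValueError("requested past end of dataset")
    return list(range(offset, offset + limit))

def fetch_all():
    rows = []
    offset = 0
    while True:
        try:
            page = fetch_page(offset, PAGE_SIZE)    # normal 1000-record call
        except ValueError:
            remaining = TOTAL_RECORDS - offset      # the "otherwise" branch:
            if remaining <= 0:
                break
            page = fetch_page(offset, remaining)    # retry with what's left
        rows.extend(page)
        if len(page) < PAGE_SIZE:                   # short page = last page
            break
        offset += PAGE_SIZE
    return rows

all_reviews = fetch_all()
```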

The report consists of:

  • Cover
    • overview of the number of total reviews, total venues, venues visited by Dave, and average overall score.
    • reviews and score by community and Dave over time
  • Reviews
    • overall score by Dave and the community
    • histogram of scores by Dave and the community
    • score over time by Dave and the community
    • reviews over time by Dave and the community
  • Restaurants
    • map of reviews
    • filter to a country, state, city, or search by address
    • listing of restaurants showing the address, number of reviews, Dave’s score, community score, the median score, and the mode score
    • drillthrough to an individual restaurant
  • Restaurant
    • individual restaurant view
    • overall score based on number of reviews
    • Dave’s score
    • community score
    • restaurant information (address, provider rating/url, restaurant category(s), etc.)
    • individual reviews for the restaurant by user
  • Users
    • map of reviews
    • filter to a country, state, city, or search by address
    • listing of users showing the number of reviews, their average score, their top score, their low score, their median score, their mode score, and number of followers
    • drillthrough to an individual user
  • User
    • individual user view
    • overall score based on number of reviews
    • average score over time
    • map of reviews
    • individual reviews by the user
  • Decomposition Tree
    • drilling down via geography based on a variable measure (typically score)
  • Key Influencers
    • an attempt to see if there are any factors that help influence a higher score
    • what would be great here is if each restaurant was flagged with the type of oven used (wood-burning, etc.), the type of pizza reviewed (Detroit, NY, deep dish, margherita, etc.), chain or not, how long in business, etc., as those would help show a correlation to higher scores

How to analyze Power BI Usage across workspaces

Workarounds…..workarounds…..workarounds. We all build workarounds as necessary until more enterprise-ready functionality is made available.

This is true for Power BI Usage metrics across workspaces.

We have usage models for each individual workspace but there are several issues with them.

  1. you have to copy the model in order to edit it and make it useful
  2. you are unable to share this model with non-admins or non-members of the workspace (why? – business owners want to see who is using the report, how often, etc.)
  3. you have to make this modification in each and every workspace you want to measure usage against
  4. you are unable to connect to them via XMLA

Therefore, you have to create a workaround. What are your options?

  1. Office 365 audit logs solution
  2. ?
  3. wait for the Power BI group to implement cross-workspace usage analysis

What if you do not have access to the audit logs? What if you do not want to wait and require this information now?

Several groups will download the usage report pbix and export the data they need… too manual. So, I wanted to automate this task. How?

I use a combination of:

  1. Analyze in Excel
  2. PowerShell
  3. Windows Task Scheduler
  4. Power BI Dataflow
  5. Power BI Data Model

* Note – this does not require Power BI Premium but does require you to be an admin of each workspace you would like to gather stats for.

Analyze in Excel

  1. Connect to the Power BI Report Usage Metrics Model in the first of the respective Power BI Workspaces.
  2. Create a Pivot Table with the following fields:
    1. 'Reports'[ReportGuid]
    2. 'Reports'[DisplayName]
    3. 'Dates'[Date]
    4. 'Users'[UserPrincipalName]
    5. 'DistributionMethods'[Name]
    6. 'Views'[ViewersCount]
    7. 'Views'[ViewsCount]
  3. Repeat steps 1 & 2 for each workspace you would like to gather stats for.
  4. Copy the sheet from each respective Excel file to a main Excel file. This will in turn copy the connections as well.
  5. Save the file to SharePoint on Office365 (or OneDrive).

PowerShell

Create a script to:

  1. Download the Excel file w/ all Pivot Tables from SharePoint on Office365 (or OneDrive).
  2. Refresh the Excel file.
  3. Upload the Excel file back to SharePoint on Office365 (or OneDrive).

Windows Task Scheduler

Schedule the PowerShell task to run on whatever schedule you would like.

  • Action = Start a program
  • Program = powershell.exe
  • Arguments = -ExecutionPolicy Bypass -File “<directory/path to PowerShell script>”
  • Run as local administrator

Power BI Dataflow

Schedule a dataflow to pick up the data from the Excel file (cycle through tabs), aggregate the data, etc.
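
The aggregation step can be sketched as: read every tab (one per workspace), tag each row with the tab name, and union them. Shown here with plain Python structures purely for illustration; the real dataflow does this in Power Query against the Excel file on SharePoint, and the field names are hypothetical.

```python
sheets = {  # tab name -> rows exported from that workspace's usage model
    "Finance": [{"ReportGuid": "a1", "ViewsCount": 12}],
    "Sales":   [{"ReportGuid": "b2", "ViewsCount": 7},
                {"ReportGuid": "b3", "ViewsCount": 3}],
}

usage = [
    {**row, "Workspace": tab}       # tag each row with its source tab
    for tab, rows in sheets.items()
    for row in rows
]

total_views = sum(row["ViewsCount"] for row in usage)
```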

Power BI Data Model / Report

Create a report using the Power BI usage data to visualize usage and join to other data (e.g. from the Power BI API – report/dataset relationships, dataset owners, etc.).

Share access to the report, use it in other data models to show report-specific data in an individual report, etc. Do with it what you will as they say.

Conclusion

Because there are so many variable parts to this solution, I have held off on creating this as a solution on vizbp.com. As well, this is a temporary solution as the hope is that the Microsoft release for this functionality does not restrict it to Power BI Admins or Capacity Admins only and allows this report to be shared (or accessed via XMLA) by the organization.

If there is enough interest, I can look to create this as a template on vizbp.


Infor Syteline Reports in Power BI

In working with applications and ERP systems, you will find they lack in-depth analysis and reporting. There comes a time when you realize external reporting is needed to augment the standard reporting. This is how the Infor Syteline Reports in Power BI came about.

In one of my previous roles and past consulting engagements, I built these Syteline reports in Power BI for:

  1. Analyzing the general ledger data
  2. Adding insight into estimates, orders, jobs, job hours, shipping/receiving, RMAs, order references, and customer/item/vendor additions.
  3. Providing extra insight into daily sales and invoicing activity

These came out of genuine business requests for data to help their everyday operations.

The Daily Sales Report in particular was built in order to ensure the companies were moving sales through the channels in a timely manner and invoicing their open orders before the end of the month. Are there too many order dollars in backorder, ordered but not shipped, or even more importantly – shipped but not invoiced? How much has been input this month? How much has been invoiced this month? How does that compare to the budget? How much is projected to be invoiced?

The General Ledger Report simply analyzes your general ledger data by unit codes 1-4 over time.

The Order Analysis Report provides valuable insight into your estimates, orders, jobs, job hours, shipping/receiving, RMAs, order references, and customer/item/vendor additions. Slice and dice estimates, orders, jobs, shipping/receiving, and RMAs, and compare job estimates to actuals. Review orders broken down by reference type as they are booked to analyze counts, value, margin, and gross margin percentage, so you can find orders booked incorrectly before they are invoiced.

Take a look at the product and see if it would be valuable for you or your organization. It is available on vizbp.com:

While there, take a look at other solutions people like me are making available for free or for purchase. Take a look at becoming a seller yourself and offering your solutions in the marketplace.

https://vizbp.com/sell/

While here, take a look at my other articles related to vizbp.com:


How to compare Power BI Models

Want to compare models to document improvement? Want to compare your DEV/QA model with your model in PRODUCTION? Want to know the model size improvement of modifications made to a model? The ability to compare and analyze models, table and column sizing, and relationships allows you to show the improvements made to a model via optimization techniques. Enter the Power BI Model Comparison solution @ vizbp.

YOU DO NOT HAVE TO BE A TENANT ADMIN with this product – just an admin of the Premium workspaces you want to analyze!

Premium is required currently until XMLA connectivity is opened up to non-premium workspaces.

There are other solutions available for portions of this functionality but most are only available for single models (I will reference these at the end of the post).

What do you do if you want to document or analyze improvements made to your solution? With existing solutions, you have to download each of the pbix files you would like to compare, run analysis on each one individually, and manually document the sizes, comparisons, and improvements. You can still perform this analysis – don’t get me wrong. The value in this product is the huge head start it gives you on that development, at a cost much less than the hours it would take to develop a product like this from scratch or to manually document improvements. It even has some capabilities that the other solutions do not.

It is a fully functional data model – joining all Power BI models and DMV queries to be able to see data model size, table size, column size, measure DAX expressions, relationships, etc. It also has all of the metrics and visuals built out to easily analyze your Power BI solutions.

If you want to connect to a single model, there are a lot of options (I will list those at the bottom of the post). I even built a version that would find the open pbix port via R and query that single model. This works great during development but not when you want to analyze your full environment.

It utilizes the Power BI API to query your Power BI environment for the workspaces of which you are an admin and the underlying datasets you specify. It then queries each one utilizing DMV functions to extract the dataset components (tables, columns, measures, relationships, etc.).

If you want to connect to the Power BI API and build this solution yourself, you very well can. You just need to create an Azure Active Directory application, download some freely available connectors, and/or utilize various methods to work around the OAuth2 authentication protocol (I will include links below), and then you can begin connecting to your Power BI environment. In order to build reporting, you have to do some hijinks and jump through some hurdles (and be pretty good at M) to build out a functioning model.

You also have to know the DMV queries and how to join them all together.
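
For a flavor of what that involves, here is a sketch of the tabular metadata DMVs such a solution pulls and how they relate, held in Python structures purely as a container for illustration. The view and key column names reflect the TMSCHEMA DMVs as I understand them – verify against your own model; in practice you would run these over XMLA, not from Python.

```python
# DMV queries for the core tabular metadata views.
DMV_QUERIES = {
    "tables":        "SELECT * FROM $SYSTEM.TMSCHEMA_TABLES",
    "columns":       "SELECT * FROM $SYSTEM.TMSCHEMA_COLUMNS",
    "measures":      "SELECT * FROM $SYSTEM.TMSCHEMA_MEASURES",
    "relationships": "SELECT * FROM $SYSTEM.TMSCHEMA_RELATIONSHIPS",
    # Per-column storage sizes come from DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS;
    # its joins are fiddlier (name-based), so they are omitted here.
    "segments":      "SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS",
}

# How the views relate: child view -> (child key, parent view, parent key)
JOIN_KEYS = {
    "columns":       ("TableID", "tables", "ID"),
    "measures":      ("TableID", "tables", "ID"),
    "relationships": ("FromTableID", "tables", "ID"),
}
```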

But, believe me, it takes dozens to hundreds of hours to build out a solution like this.

If you have the time, I encourage you to go for it. It is an experience building something like this that is so valuable to an organization or to the individuals like me. I want my Power BI solutions and environment (Pro or Premium) to be fully optimized. I thought:

  1. There has to be a way to connect to the solutions I have access to.
  2. There has to be a way to pull two data models in a single view.
  3. There has to be a way to compare those models to document improvements to data model size, changes to measures, reduction in the number of columns or tables, etc.

I have built a product that does these things and more!

Take a look at the product and see if it would be valuable for you or your organization. It is available on vizbp.com:

Power BI Model Comparison

While there, take a look at other solutions people like me are making available for free or for purchase. Take a look at becoming a seller yourself and offering your solutions in the marketplace.

https://vizbp.com/sell/

While here, take a look at my other articles related to vizbp.com:

References to other Power BI model analyzer solutions (my apologies if I did not reference yours – let me know if I failed to and I will add yours as well):


PowerApps Report Survey to collect user feedback

Need to collect end user feedback on a reporting solution? The ability to create a simple, unobstructed feedback loop is critical to end user adoption and usability of your reporting environment.

This is a very simple PowerApps solution built on the PowerApps Employee Engagement Survey Example.

It collects critical information on the completeness of data, visuals, speed, overall score, and comments about a specific Power BI or reporting solution.

It is made to be embedded in a Power BI report in order to easily collect the information from consumers while they are accessing the report directly.

Take a look at the product and see if it would be valuable for you or your organization. It is available on vizbp.com:

PowerApps Report Survey

While there, take a look at other solutions people like me are making available for free or for purchase. Take a look at becoming a seller yourself and offering your solutions in the marketplace.

https://vizbp.com/sell/

While here, take a look at my other articles related to vizbp.com:


Power BI + Power Automate Refresh Trigger

Need to trigger a refresh on a Power BI solution?  Want to trigger a refresh based on a completed ETL run or when a file is created in a SharePoint directory? Want to notify the dataset owner when the refresh begins, when it completes successfully, or only when the refresh fails? The ability to trigger a refresh not based on a schedule but on a variable trigger and notify the owner of the dataset if the refresh fails is critical to an optimized environment and environment oversight.

There is a new Power Automate (formerly Microsoft Flow) action to refresh a Power BI solution. So, what does this solution do that is different from that action?

With a Power BI triggered refresh via the API, there is no notification if there is a failure. This solution:

  1. Allows for sending a notification when a refresh is triggered.
  2. Checks the status of the manual refresh every 5 minutes (configurable).
  3. Allows for emailing the owner on success.
  4. Allows for emailing the owner on failure.
  5. If you capture the initiator of the trigger, you can email that person when triggered and upon success/failure.
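
Under the hood, the flow hits the refreshes endpoint of the Power BI REST API. A rough Python sketch of the trigger-and-poll pattern follows; the get_status callable is a stand-in, since the real flow calls the endpoint with an OAuth2 bearer token and waits 5 minutes between checks:

```python
BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id, dataset_id):
    # POST here to trigger a refresh; GET here (with ?$top=1) to check status
    return f"{BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def poll_refresh(get_status, max_checks=10):
    """Check refresh status until it leaves the in-progress ('Unknown') state."""
    for _ in range(max_checks):          # in the flow: a 5-minute delay loop
        status = get_status()
        if status != "Unknown":          # 'Completed', 'Failed', or 'Disabled'
            return status                # -> email the owner on success/failure
    return "TimedOut"

# Simulated status sequence: two in-progress checks, then completion.
statuses = iter(["Unknown", "Unknown", "Completed"])
result = poll_refresh(lambda: next(statuses))
```

In the refresh history the in-progress state is reported as "Unknown", which is why the loop keeps polling on that value.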

This solution will be updated periodically as new feature requests and other improvement ideas come in. Make sure you’re running the latest version. The application will have a version displayed and will notify you if there is a new version available.

Take a look at the product and see if it would be valuable for you or your organization. It is available on vizbp.com:

Power BI + Automate Refresh Trigger

While there, take a look at other solutions people like me are making available for free or for purchase. Take a look at becoming a seller yourself and offering your solutions in the marketplace.

https://vizbp.com/sell/

While here, take a look at my other articles related to vizbp.com:


Power BI + Automate Gateway Status

Need to be notified when your Power BI Gateway data source loses connectivity? The ability to identify connectivity issues before your end users or sponsors do is critical to end user adoption in your Power BI environment.

This solution will be updated periodically as new feature requests and other improvement ideas come in. Make sure you’re running the latest version. The report will have a version displayed and will notify you if there is a new version available.

The Power BI + Power Automate Gateway Status solution utilizes the free Power BI API + Power Automate Custom Connector.

Based on your trigger configuration, this can check all of the data sources for every gateway, or for a single gateway, configured in your Power BI environment. Notify the owners of the gateway(s) or whomever you specify in the flow.

This allows you to resolve issues with a data source failing on your gateway prior to end users noticing.

Take a look at the product and see if it would be valuable for you or your organization. It is available on vizbp.com:

Power BI + Power Automate Gateway Status

While there, take a look at other solutions people like me are making available for free or for purchase. Take a look at becoming a seller yourself and offering your solutions in the marketplace.

https://vizbp.com/sell/

While here, take a look at my other articles related to vizbp.com:


Why you need a Power BI Report Template

Why do you need a Power BI Report Template?

When developing reports for an organization, there are three primary considerations for a Power BI Report Template.

  1. A consistent look and feel across your Power BI solutions.
  2. Consistent components for your Power BI solutions.
  3. Simplified development process for your group.

A consistent look and feel

Include a default page w/ standard titles, layouts, slicers, bookmarks, etc.

Consistent components

Include a last refresh timestamp, a cover page with metadata and contact information, a definitions page, etc.

Simplified development process

In Query Editor, include connections to your most commonly used data sources (set to not load and not refresh) so developers can pick their connections to reference for downstream connections and queries. Include commonly used Power Query functions. Include connections to your Power BI dataflows. This makes the development process just a bit faster to initiate.

Take a look at the product and see if it would be valuable for you or your organization. It is available on vizbp.com:

Power BI Report Template

While there, take a look at other solutions people like me are making available for free or for purchase. Take a look at becoming a seller yourself and offering your solutions in the marketplace.

https://vizbp.com/sell/

While here, take a look at my other articles related to vizbp.com:


Power BI + Power Automate API Connector

There are many use cases to connect Power BI + Power Automate via the Power BI API.

  • Refresh a solution on a different schedule than is available in the service (i.e. monthly on the 1st, not @ a specific time – triggered by the end of a variable ETL process, kicked off via PowerApps after a manual load of source data, etc.)
  • Check status of refresh at any time
  • Check gateway status
  • Download pbix files and save to a shared drive
  • and more!

If you want to add administration monitoring capabilities and notification to your Power BI environment or have the ability to connect and automate processes in your Power BI environment, Power BI + Power Automate is a great combination to utilize.

This is a custom connector created for connecting to the Power BI API from Power Automate (formerly Microsoft Flow) or Azure Logic Apps.

I had issues utilizing this one: https://github.com/microsoft/PowerBI-CSharp/blob/master/sdk/swaggers/swaggerV2.json

I did not have issues with this one: https://github.com/microsoft/powerbi-rest-api-specs/blob/master/swagger.json

To import from an OpenAPI / swagger file, follow the directions here: https://docs.microsoft.com/en-us/connectors/custom-connectors/define-openapi-definition

By far, this is the best article to follow on configuring your Power BI Custom Connector (after importing from swagger/OpenAPI file): https://medium.com/@Konstantinos_Ioannou/refresh-powerbi-dataset-with-microsoft-flow-73836c727c33

Take a look at the product and see if it would be valuable for you or your organization. It is available on vizbp.com:

Power BI + Power Automate API Connector

While there, take a look at other solutions people like me are making available for free or for purchase. Take a look at becoming a seller yourself and offering your solutions in the marketplace.

https://vizbp.com/sell/

While here, take a look at my other articles related to vizbp.com:


Power BI Wide World Importers Demo

This blog will start off with a word (or a lot of words) of advice for job seekers looking for a new Power BI development position.

Bring Power BI examples.

DO NOT come empty-handed and explain that you cannot show your prior work because it was confidential data.

There are plenty of free data sources that you can get your hands on to show that you have initiative and that you fine-tune your craft outside of the assigned projects you have worked on.

I am not going to list all of the free data sources available for building sample data models and visuals. There are plenty of resources out there if you search for them. I will include a few links to sites that aggregate those free data sources below:

But…..

DO NOT provide samples with confidential data that you created at a prior employer. This is a huge red flag that you are not concerned about privacy, protecting data, etc.

DO NOT steal and present someone else’s solution as your own.

DO NOT quickly build a model without implementing best practices.

DO take the time to ensure the data model is clean: parameterize your data sources (or at least reference them in other queries), implement query folding where possible, use friendly/descriptive naming on your Power Query steps, use query grouping for organizational purposes, add documentation in code as necessary, denormalize as necessary to move toward a star schema and away from a snowflake, etc.

DO take the time to show you pay attention to detail. Align visuals and distribute them evenly, use friendly titles on visuals, rename visuals (easier in the selection panel) and while there, group them nicely (make sure you are using modern headers to do so), make sure visual interactions are optimal, make sure column spacing is wide enough to accommodate totals, use conditional formatting where appropriate, etc.

DO take the time to update your old models to utilize new features and/or visuals; showcase them in your solutions. This shows that you keep up with new features and functionality.

Veering away from that topic now…..I am providing a quick sample of a Power BI model (pbix) built on a portion of the Wide World Importers data warehouse provided by Microsoft: https://github.com/Microsoft/sql-server-samples/releases/tag/wide-world-importers-v1.0

Feel free to use it, put your own spin on the visuals, and provide it to prospective companies as an example.

There are other fairly standard models here for use as well: https://docs.microsoft.com/en-us/power-bi/sample-datasets#the-power-bi-samples-as-pbix-files

They could definitely use an overhaul on the front end but that is why they are perfect for this scenario! You can show the original and then show your upgraded (hopefully) version to detail improvement.

What would be the best example to give? Find a poorly built model, implement best practices, and show the before and after – along with all of the noted improvements and best practices implemented. Reduce the granularity from 10 million to 2 million records? Reduce the size of the model by 50%? Increase the speed of a DAX measure by 30%? Add metadata definitions to the model? Ensure query folding was occurring where possible? Note it all and what the benefit of doing so is.

PROVIDE POTENTIAL EMPLOYERS EXAMPLES THAT SHOW YOU HAVE AN UNDERSTANDING OF THE TECHNOLOGY!
