Business Intelligence for developers
I'm a developer with apps backed by SQL databases. I want to give my CEO access through self-service BI tools such as Power BI or Tableau.
My issue is that the affordable tools all seem to require someone to define the model in an interactive application, leaving no means of automation.
In practice, that means that every time I add a column somewhere, I have to fire up some sluggish desktop application, manually add the column to the model there too, and then trigger some deployment process. I'm a developer; I need to automate such things so I'm not overwhelmed by complexity.
I've looked at Power BI's file format, but it's binary. I've looked at both of these products' REST APIs, but all they allow is uploading that binary, undocumented file format.
I'm looking for either:
- a way to create a model programmatically with these tools, or
- affordable alternatives that allow that.
I'm otherwise satisfied with both of these tools.
I also had a look at Looker, but that already costs thousands of dollars per month.
business-intelligence powerbi
asked Sep 4 '18 at 12:23
John
1 Answer
Power BI uses Power Query (M) to extract and transform the data that is loaded into the data model. A Power Query statement is broken into individual steps that are reevaluated each time the report dataset is refreshed. This fluidity can be leveraged to return dynamically structured datasets. Power Query can consume unstructured data, such as JSON or XML, and programmatically expand each element into separate columns as part of its sequence of applied steps.
Once the Power BI report is published, each refresh will retrieve the current unstructured base data from the source JSON column, then parse it and update the column structure of the resulting dataset automatically, without needing to manually refresh the data and republish from Power BI Desktop.
If the source data is not already in an unstructured format, you will need to prepare the raw tabular data for Power Query ingestion. The desired columns need to be rendered as key/value pairs in a valid JSON string in the source table that you will reference in Power Query Editor. Make sure you put these columns into a single JSON object for each row, as opposed to putting the key/value pairs for every row into nested objects within a single element. SQL Server 2016 and later includes JSON functions (such as FOR JSON) that can assist with this transformation.
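If the JSON formatting is done at the source, it can also be embedded directly in the query Power Query sends to the database. The following is a minimal sketch, assuming a SQL Server 2016+ source and using FOR JSON PATH to render each row's dynamic attributes as a single JSON object; the server, database, table, and column names are hypothetical:
let
    // Hypothetical names; the native query renders the dynamic columns of each
    // row into one JSON object in the "Attributes" column (SQL Server 2016+).
    Source = Sql.Database("myserver", "SalesDb", [Query = "SELECT o.OrderId, (SELECT o.CustomerName, o.Quantity, o.UnitPrice FOR JSON PATH, WITHOUT_ARRAY_WRAPPER) AS Attributes FROM dbo.Orders AS o"])
in
    Source
Alternatively, the same FOR JSON expression can be wrapped in a view on the database side, so the report only references a plain table.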
The required Power Query functions are as follows:
Table.TransformColumns to parse the contents of a valid JSON string column with Json.Document. This applied step can also be inserted by choosing "Transform -> JSON" from the context menu of the column header in Power Query Editor.
JSONTableName = Table.TransformColumns(sourceTableName,{{JSONStringColumnName, Json.Document}})
Table.ExpandRecordColumn to expand the parsed JSON object keys into columns. To have the columns rendered dynamically, you will also need a combination of Record.FieldNames, Record.Combine, and Table.Column, as shown in the combined sketch after this list.
DynamicTableName = Table.ExpandRecordColumn(JSONTableName, JSONStringColumnName, Record.FieldNames(Record.Combine(Table.Column(JSONTableName, JSONStringColumnName))))
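Putting the two steps together, a complete query might look like the following. This is a sketch only; the server, database, table, and column names ("Attributes" as the JSON string column) are hypothetical, and it assumes every row contains a non-null JSON object:
let
    // Hypothetical SQL Server source table with a JSON string column "Attributes".
    Source = Sql.Database("myserver", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Step 1: parse the JSON string in each row into a record.
    Parsed = Table.TransformColumns(Orders, {{"Attributes", Json.Document}}),
    // Step 2: derive the column list from the keys present across all rows,
    // then expand the records into real columns.
    Keys = Record.FieldNames(Record.Combine(Table.Column(Parsed, "Attributes"))),
    Expanded = Table.ExpandRecordColumn(Parsed, "Attributes", Keys)
in
    Expanded
Because Keys is computed at refresh time, any new key that appears in the source JSON becomes a new column on the next refresh without touching Power BI Desktop.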
One consideration for this dynamic approach: if you remove a key/value pair from the JSON object, it will, by design, remove the column from the refreshed dataset. Any visualizations that use that particular column will then show an error; this is not critical and can be fixed automatically by clicking the "Fix This" button, but to avoid it altogether, it is best not to remove columns from the structure.
answered Jan 5 at 0:16
solutionist