How to Call a Machine Learning Model Deployed with Azure ML from Python

import requests
import json

# URL for the web service
scoring_uri = 'http://93263292-1a32-4a54-86d5-abc31c4f7314.northeurope.azurecontainer.io/score'
# If the service is authenticated, set the key or token
key = ''

# Two sets of data to score, so we get two results back
data = "{\"input_df\": [{\"feature1\": value1, \"feature2\": value2}]}"
data = {"data": [{"ad_id": 299545880.0, "price": 10000000, "rooms": 2.0, "is_old": "Neubau", "post_code": 1030.0, "energy_class": "B", "property_type_val": "Wohnung", "living_area": 57.0, "collectiondate": "2019-03-08T00:00:00.000Z", "filename": "dbfs:/mnt/willhaben/8032019/299545880.json"}]}
data = {"data": [{"rooms": 3, "is_old": "Neubau", "post_code": 1030.0, "energy_class": "B", "property_type_val": "DG+Wohnung", "living_area": 100.0}]}
print(data)

# Convert to JSON string
input_data = json.dumps(data)

# Set the content type
headers = {'Content-Type': 'application/json'}
# If authentication is enabled, set the authorization header
if key:
    headers['Authorization'] = 'Bearer ' + key

# Make the request and display the response
resp = requests.post(scoring_uri, input_data, headers=headers)
print(resp.text)

Submit a simple job to Azure Batch with Node.js (minimal example)

In this blog post I will describe how to use Azure Batch to run an arbitrary script that reads data from an input container (on Azure Blob Storage) and writes data to an output container (also on Azure Blob Storage). The script itself is also stored in a container named Resources on the same Azure Blob
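A minimal sketch of the submission step, assuming the classic azure-batch npm package; the account, pool, job, container and script names below are placeholders and not values from the post:

// Sketch: submit a job with one task to an existing Azure Batch pool.
// Account URL, pool id and the SAS URL of the script are placeholders.
import * as batch from "azure-batch";

const accountName = "<batch-account-name>";
const accountKey = "<batch-account-key>";
const accountUrl = "https://<batch-account-name>.<region>.batch.azure.com";

const credentials = new batch.SharedKeyCredentials(accountName, accountKey);
const client = new batch.ServiceClient(credentials, accountUrl);

// The job runs on an existing pool.
const jobConfig = { id: "minimal-job", poolInfo: { poolId: "<existing-pool-id>" } };

// The task downloads the script from the Resources container via a SAS URL and runs it;
// the script itself reads from the input container and writes to the output container.
const taskConfig = {
  id: "run-script",
  commandLine: "python script.py",
  resourceFiles: [
    { blobSource: "https://<storage-account>.blob.core.windows.net/resources/script.py?<sas>", filePath: "script.py" }
  ]
};

client.job.add(jobConfig, (jobErr: Error | null) => {
  if (jobErr) throw jobErr;
  client.task.add("minimal-job", taskConfig, (taskErr: Error | null) => {
    if (taskErr) throw taskErr;
    console.log("job and task submitted");
  });
});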

Connect Node.js to SQL Azure with Tedious

The following shows how to create a simple Node.js project that reads and writes data to SQL Azure. To get a clean foundation for further development I decided to split things into multiple files (dataaccess, model and management). I also added an appdbtest.js file to demo how to use the management class to create new
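A minimal sketch of the data-access side, assuming a current version of the tedious package; the server, database, credentials and the SELECT statement are placeholders rather than values from the project:

// Sketch: open a connection to SQL Azure with tedious and run a simple query.
import { Connection, Request } from "tedious";

const connection = new Connection({
  server: "<server-name>.database.windows.net",
  authentication: {
    type: "default",
    options: { userName: "<user>", password: "<password>" }
  },
  options: { database: "<database>", encrypt: true }   // encryption is required for SQL Azure
});

connection.on("connect", (err) => {
  if (err) throw err;

  // Read some rows; each row arrives as an array of column objects.
  const request = new Request("SELECT TOP 10 * FROM MyTable", (queryErr, rowCount) => {
    if (queryErr) throw queryErr;
    console.log(`${rowCount} rows returned`);
    connection.close();
  });

  request.on("row", (columns) => {
    columns.forEach((column) => console.log(column.metadata.colName, column.value));
  });

  connection.execSql(request);
});

connection.connect();   // newer tedious versions require an explicit connect()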

How to create a Docker image with a Node.js Express website, push it to Azure Container Registry (ACR), create a Linux Web App for Containers and configure it to run the Docker image stored in the ACR.

I just created a very simple Node.js Express web app and wanted to deploy it to Azure. Install Node.js and Docker for Windows. Create a folder, navigate to it from a command line and run npm init. Then run npm install express --save to install the JavaScript Express framework. Then create an app.js
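For reference, a minimal sketch of what such an app can look like; the port and the greeting text are arbitrary choices for the example, not taken from the post:

// Sketch of a minimal Express app: one route, listening on a configurable port.
import express from "express";

const app = express();
const port = Number(process.env.PORT) || 3000;

app.get("/", (req, res) => {
  res.send("Hello from the containerized Express app");
});

app.listen(port, () => {
  console.log(`listening on port ${port}`);
});

From there the usual steps are docker build, docker push against the ACR login server, and pointing the Web App's container settings at that image.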

R-Density Plot (Histogram) for a Power BI Performance Baseline

install.packages("ggplot2")
install.packages("scales")

Just drag the duration column of your dataset onto the R-Chart control in Power BI and add the following code to your R control:

library("ggplot2")
library("scales")
ggplot(dataset) +
  geom_histogram(binwidth = 500, aes(x = duration / 1000, y = ..ncount..)) +
  scale_y_continuous(labels = percent_format())

You can also try changing the binwidth and filtering out very fast queries with a Power BI

Generate a SQL Script to run page-compression on all SQL Tables that exceed a certain saving rate threshold

This script reuses a script from Eli Leiba published here: https://www.mssqltips.com/sqlservertip/2381/sql-server-data-compression-storage-savings-for-all-tables/ and extends it to also generate the compression script for tables where the saving rate exceeds a certain threshold (in the script below the threshold is 20%).

--DISCLAIMER:
--This code is not supported under any Microsoft standard support program or service.
--This code and information are provided

Tricks you should know when you use Reporting Services for enterprise-scale projects.

Reporting Services tips and tricks. Reporting Services is a great product that can render everything that you can imagine. However, sometimes it requires some tricks to get the job done. The following are lessons learned from my last large Reporting Services project. (The text is a quick and dirty summary – maybe I will take

Brexit – Economic Impact Simulation

The following simulation shows why Brexit may not be a good idea. Assume you have several hundred projects across Europe and a few experts in each country to deliver them. Companies usually try to model and optimize the assignment of experts to projects. It is quite interesting to see that adding new political constraints
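To make the mechanism concrete, here is a toy sketch with an invented 3x3 cost matrix: it brute-forces the cheapest expert-to-project assignment once without restrictions and once with a hypothetical constraint that blocks one expert from two projects (all numbers and labels are made up for illustration):

// Toy illustration: brute-force the cheapest one-to-one assignment of experts to projects,
// once unconstrained and once with an extra "political" constraint. The costs are invented.
const cost = [
  [2, 5, 9],   // expert 0
  [4, 3, 8],   // expert 1
  [7, 6, 1]    // expert 2
];

// allowed[e][p] = false means expert e may no longer work on project p.
function cheapestAssignment(allowed: boolean[][]): number {
  const n = cost.length;
  let best = Infinity;
  const permute = (projects: number[], chosen: number[]) => {
    if (chosen.length === n) {
      best = Math.min(best, chosen.reduce((sum, p, e) => sum + cost[e][p], 0));
      return;
    }
    for (const p of projects) {
      if (!allowed[chosen.length][p]) continue;
      permute(projects.filter((x) => x !== p), [...chosen, p]);
    }
  };
  permute([0, 1, 2], []);
  return best;
}

const noConstraint = [[true, true, true], [true, true, true], [true, true, true]];
// Hypothetical constraint: expert 0 can no longer be assigned to projects 0 and 1.
const withConstraint = [[false, false, true], [true, true, true], [true, true, true]];

console.log("optimal cost, unconstrained:", cheapestAssignment(noConstraint));
console.log("optimal cost, constrained:  ", cheapestAssignment(withConstraint));

Even in this tiny example the constrained optimum is noticeably more expensive than the unconstrained one.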