Paul Hernandez playing with Data

Business Intelligence, Data Analysis, Advanced Analytics and more …

    • About
  • July 30, 2020

    Azure Digital Twins Management with Postman

    In this video I show how to manage Azure Digital Twins models and instances using Postman. How to use the ADT Explorer is explained in my previous post: https://hernandezpaul.wordpress.com/2020/07/24/azure-digital-twins-and-adt-explorer-say-hello. To make the Postman collection work, you need to configure an environment as follows: tenantId = your tenant Id, which can be found…
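    The post configures these values inside a Postman environment; as a rough sketch of what that environment feeds into, the snippet below builds the Azure AD client-credentials token request that authorizes calls to the Azure Digital Twins data plane. The tenant and client IDs are placeholders, not values from the post.

    ```python
    # Sketch of the Azure AD OAuth2 request a Postman environment for
    # Azure Digital Twins typically drives. All IDs below are placeholders.
    tenant_id = "00000000-0000-0000-0000-000000000000"   # tenantId from the environment
    client_id = "11111111-1111-1111-1111-111111111111"   # clientId of the app registration

    # v2.0 client-credentials token endpoint for the tenant
    token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

    # Form fields sent to obtain a bearer token for the ADT data plane
    token_request = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": "<clientSecret>",               # never commit real secrets
        "scope": "https://digitaltwins.azure.net/.default",
    }

    print(token_url)
    ```

    Postman performs the same POST under the hood when the collection's authorization is set to OAuth 2.0 with these environment variables.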

  • July 24, 2020

    Azure Digital Twins and ADT Explorer – say hello!

    Introduction Azure Digital Twins Service offers a way to build next-generation IoT solutions. There are other approaches on the market for describing IoT devices and building digital twins. Without making a formal comparison, I can say that with Azure Digital Twins it is possible to build a powerful semantic layer on top of your connected…

  • December 17, 2019

    Streaming Technologies Comparison

    After some time, I decided to share my notes comparing different open-source streaming technologies on LinkedIn: Streaming Technologies Comparison, https://www.linkedin.com/pulse/streaming-technologies-comparison-paul-hernandez

  • October 10, 2017

    Installing Apache Zeppelin 0.7.3 in HDP 2.5.3 with Spark and Spark2 Interpreters

    Background For a recent client requirement I needed to propose a solution to add Spark2 as an interpreter to Zeppelin in HDP (Hortonworks Data Platform) 2.5.3. The first hurdle: HDP 2.5.3 ships with Zeppelin 0.6.0, which does not support Spark2 (included only as a technical preview). Upgrading the HDP version was not…

  • May 2, 2017

    Talend job to lookup geographic coordinates into a shape file

    Introduction Recently, for an open data integration project, I had to select tools capable of processing geospatial data. I had a couple of choices: use R and try to work out a solution with the packages available on the server, or use Talend. One of the biggest restrictions…

  • January 25, 2017

    Connect to Hive using Teradata Studio 16

    Introduction Teradata Studio is the client used to perform database administration tasks on Aster and Teradata databases, as well as to move data to and from Hadoop. Recently I was asked to test a solution to integrate Hadoop with Teradata in order to build a modern Data Warehouse architecture. This was my first step, and I…

  • January 24, 2017

    Teradata Express 15.10 Installation using Oracle VirtualBox

    Introduction For professional reasons I needed to start learning Teradata after some years of intensive Microsoft BI projects. To break the ice and have a playground where I can test everything I want, I decided to download the newest Teradata Express virtual machine (TDE), which comes with the 15.10 engine plus some additional tools. In my…

  • November 14, 2016

    Apache Zeppelin installation on Windows 10

    Disclaimer: I am not a Windows or Microsoft fan, but I am a frequent Windows user, and it’s the most common OS I find in the enterprise. Therefore, I decided to try Apache Zeppelin on my Windows 10 laptop and share my experience with you. The behavior should be similar in other operating systems.…

  • July 8, 2016

    Introduction to R Services and R client – SQL Server 2016

    Introduction After some time using R and SQL Server as two separate tools (not 100% true, because I have already imported data from SQL Server into RStudio), Microsoft now offers R Services as part of SQL Server 2016. That seems very promising, especially for Microsoft BI professionals. One of the…

  • June 16, 2016

    Export data to Hadoop using Polybase – Insert into external table

    Introduction This post is a continuation of Polybase Query Service and Hadoop – Welcome SQL Server 2016. One of the most interesting use cases of PolyBase is the ability to store historical data from relational databases in a Hadoop file system. Storage costs can be reduced while keeping the data accessible, and it can still be…
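    The archiving pattern the teaser describes boils down to an INSERT INTO an external table. As a minimal sketch, the snippet below assembles the kind of T-SQL statement involved; the table names, column, and cutoff date are hypothetical, not taken from the post, and PolyBase export must be enabled on the instance (sp_configure 'allow polybase export', 1).

    ```python
    # Hypothetical names; the post's actual tables/locations are not shown here.
    external_table = "ext.SalesHistory"   # external table mapped to HDFS via PolyBase
    source_table = "dbo.SalesHistory"     # relational table holding historical rows
    cutoff = "2015-01-01"                 # archive everything older than this date

    # T-SQL that exports (archives) old rows to Hadoop through the external table.
    export_sql = (
        f"INSERT INTO {external_table} "
        f"SELECT * FROM {source_table} WHERE OrderDate < '{cutoff}';"
    )

    print(export_sql)
    ```

    In practice this statement would be executed against SQL Server (e.g. via SSMS or a client driver); the exported rows then live as files in the external table's Hadoop location while remaining queryable through T-SQL.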

