

Showing posts from 2017

Tips and Tricks to prepare a demo for Azure Time Series Insights

Azure Time Series Insights is a great tool for monitoring your systems in near-real time and getting useful insights into what is happening behind the scenes.

In this post, we will talk about things to consider before preparing a demo with it. There are behaviors that are normal when running Time Series Insights in production, but that can be annoying when you are preparing a demo or pushing mock data into it (especially when you do not understand the behavior).

I had to deliver two presentations and three demos about Time Series Insights, and I learned these things the hard way, through long nights before the demos. Based on that experience, I made a checklist that I hope will be useful to others as well.

1. Ingress of mock events
One of the most important things to keep an eye on is the moment when we push mock data into Time Series Insights. In general, we want to push a few days' or weeks' worth of data in a short period.
There are two tie…
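As an illustration of what pushing mock data looks like, the sketch below generates backdated events spread over several days. The event shape, field names, and device id are assumptions for illustration, not the Time Series Insights schema:

```python
import json
from datetime import datetime, timedelta, timezone

def generate_mock_events(days=7, interval_minutes=30):
    """Generate backdated mock telemetry events covering the last `days` days."""
    now = datetime.now(timezone.utc)
    t = now - timedelta(days=days)
    events = []
    while t < now:
        events.append({
            "deviceId": "mock-device-01",        # hypothetical device name
            "timestamp": t.isoformat(),          # event time, not ingestion time
            "temperature": 20 + (t.minute % 10)  # deterministic mock value
        })
        t += timedelta(minutes=interval_minutes)
    return events

# Two days of hourly events, serialized the way an ingestion client might send them.
events = generate_mock_events(days=2, interval_minutes=60)
payload = json.dumps(events, indent=2)
```

The key point is that each event carries its own backdated timestamp; how the ingestion pipeline treats event time versus arrival time is exactly the kind of behavior that can surprise you during a demo.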

Chart and dashboards from time perspective (real time, near-real time, consolidation)

In this post we will talk about how we can group charts, dashboards, and reports into different categories based on how fast we need to ingest and update the data in them.
Time is a relative term, especially when you put it together with business insights and application reporting. There are two important aspects related to time from a business and application insights perspective.

1. Time Granularity
The first one is related to the time granularity. In the beginning, most business stakeholders require the granularity to be as small as possible, until they realize that there is not much insight at that level and that the time granularity needs to be increased in order to understand anything.
Most of the systems available on the market today allow us to change the time interval (granularity) on the fly, enabling us to navigate our data from different time perspectives.
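As an illustration of changing granularity on the fly, the sketch below re-buckets the same raw readings at different time intervals. It is a minimal pure-Python example, not tied to any specific product:

```python
from collections import OrderedDict
from datetime import datetime, timedelta

def rebucket(readings, granularity: timedelta):
    """Group (timestamp, value) readings into fixed-size buckets and average each bucket."""
    buckets = OrderedDict()
    for ts, value in readings:
        # Align the timestamp down to the start of its bucket.
        offset = (ts - datetime.min) % granularity
        key = ts - offset
        buckets.setdefault(key, []).append(value)
    return {k: sum(v) / len(v) for k, v in buckets.items()}

# One hour of readings, every 5 minutes.
readings = [(datetime(2017, 1, 1, 0, m), float(m)) for m in range(0, 60, 5)]
hourly = rebucket(readings, timedelta(hours=1))      # one coarse bucket
quarter = rebucket(readings, timedelta(minutes=15))  # four finer buckets
```

The same raw data yields one averaged point at hourly granularity and four points at 15-minute granularity, which is exactly the navigation-between-perspectives described above.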

2. Time Interval
The second aspect is the time interval, from the moment the data a…

[Post Event] THE APOLLO 13 OF IT, Targu-Mures, November 2017

In November 2017, Endava organized in Targu-Mures an event dedicated to the local community. More than 70 people came to the event to find out more about blockchain, code quality, CD, and data privacy in the cloud.
It was a great community event where I met smart and extraordinary people. I was impressed to find such a strong IT community in Targu-Mures. At this community event, I had the opportunity to talk about data privacy in the cloud.

Below you can find my session details together with session slides.
Abstract: Once you upload content to a cloud provider, you lose physical control of your data. Who is the real owner of your data from that moment on? You, as the legal owner of the data, or the cloud provider, as its guardian? Join this session to find out more about the topic. Our main purpose is to understand who ends up having power over the data.

Who owns data inside the cloud | Apollo 13, Endava, Targu-Mures, November 2017 f…

[Post Event] CloudBrew 2017, Mechelen (Belgium)

An impressive way to finish the working week. How? By attending CloudBrew, a conference dedicated to Azure and the cloud platform. Every year it is a pleasure for me to take part in such an event.
With only two tracks, it gives you the feeling that you are among friends, chatting about Azure and beer (smile).
I had the opportunity to deliver a one-hour session about Azure Time Series Insights and PowerBI. If you want to find out more about these subjects, you can check the abstract and slides below.
Title: Near-real time reporting in Azure
Abstract: One of the most common requirements on projects nowadays is real-time monitoring and reporting. Easy to say, expensive to implement, and complex to maintain. In this session we'll take a look at the Azure services that enable us to fulfil these requirements with minimal effort and maximum benefits. We have on our radar services like Azure Time Series Insights, Analysis Services and PowerBI. After this session you will kn…

[Post Event] Endava Tech Flow, November 2017, Cluj-Napoca

This week I had the opportunity to participate in Endava Tech Flow. The main topic of this event was AI and Machine Learning. More than 100 people participated in this free event. Besides local speakers, we had the opportunity to meet J.M. Bishop (Director of the Tungsten Centre for Intelligent Data Analytics and Professor of Cognitive Computing at Goldsmiths, University of London), who talked about how dangerous artificial intelligence can be for us.

I also gave a short talk about Machine Learning, where I tried to explain what lies behind an ML system and how a neural network works. More about my session can be found below.

Title: The science behind Machine Learning
Abstract: Did you ever ask yourself how Machine Learning works as a system? In this session we will take a look at the science behind Machine Learning systems. We will decrypt the base mathematical concepts that make systems like these work. Once you attend this session you will discover how polynomial expression…

IoT Home Automation | Backend infrastructure

Even though in the previous post I said that I would write about the integration points with the garage gates, I decided to go forward with the development of the web interface and gateway software. I went down this path because once I have this part implemented, I can play with the ESP8266 as much as I want and integrate with all the devices I have around the house.

Web Interface (UI)
I decided to go with a simple design. I'm pretty sure it will become more complex in the future, but for now I will keep things as simple as possible. The web interface is done using Angular 5 together with a REST API exposed using ASP.NET Core. For now, the web interface exposes two buttons that allow me to trigger actions (open/close the gates). The web interface is hosted as a web application inside an App Service. The plan is to secure it using Azure AD, but that is another story and another post; for now there is no need, because there is no real device in the backend (yet).
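The flow behind those two buttons can be sketched as a small command dispatcher sitting between the REST API and the gateway. The command names and message shape below are hypothetical (the real backend is ASP.NET Core); this is only the idea:

```python
# Minimal sketch of the button -> command -> gateway-message flow.
# Command names and the message format are assumptions for illustration.
GATEWAY_COMMANDS = {
    "open_gate": {"target": "gate-controller", "action": "open"},
    "close_gate": {"target": "gate-controller", "action": "close"},
}

def handle_button(command: str) -> dict:
    """Translate a UI button press into a message for the gateway."""
    if command not in GATEWAY_COMMANDS:
        raise ValueError(f"Unknown command: {command}")
    return dict(GATEWAY_COMMANDS[command])

msg = handle_button("open_gate")
```

Keeping the mapping in one place means new devices around the house only add entries to the table, not new endpoints.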

The communicatio…

[Post Event] ISTAConf 2017, Sofia

I just finished my session at ISTAConf 2017 and realized that we are not only talking about NoSQL and migration strategies, we are actually doing it. I just met some great guys from Sofia who started to migrate their relational database to MongoDB. What was awesome is that they plan to migrate their system from on-premises to Azure in the next 3-6 months, so Azure Cosmos DB is a perfect fit for them.

This is the 7th edition of the ISTA Conference, which started in 2011. The last time I was here was two years ago, and compared with that time, the conference has grown a lot. Not only during the keynotes but also during the sessions, the rooms are full of people curious to find out more about current trends and what the future is preparing for us. What I like about ISTA is the format of the conference. Even though there are more than 750 participants, there are no more than 3 tracks, allowing them to keep the quality of the sessions at a high level.

At this conference I talked about Azure…

[Post Event] Meetup in Sofia together with Bulgarian BI & .Net User Group, November 2017, Sofia

Recently I had the great opportunity to be invited by the local user group from Sofia (Bulgarian BI & .Net (BI & .Net Geeks) UG) to talk about Azure and security at one of their meetups. It was a great experience, meeting smart people who are full of energy and open to new technologies and ideas.

You can find below the slides and content related to my session.
Title: "Azure Enterprise Security in Practice"
Abstract: "What does an enterprise look like when you talk about security and the cloud? Complicated, rigid, and slow to accept cloud-based architectures. After working closely with security teams from different companies, I identified different approaches and requirements that are standard for enterprises. In this session I want to discuss and share with you lessons learned on how we can map security requirements to Azure."
Slides:
Azure Enterprise Security in Practice Radu Vunvulea Codecamp Cluj Napoca Nov 2017 from Radu Vunvulea

[Post Event] ITDays 2017, Cluj-Napoca

This week was the 5th edition of ITDays in Cluj-Napoca. This is a conference that started small and became one of the most important conferences in Cluj-Napoca taking place in the second part of the year.
During the 2 days of the event, there were more than 40 sessions delivered by around 40 speakers. In addition, there were a lot of product and startup presentations that make us believe there is something more than just code (smile). I had the opportunity to be invited as a speaker at this conference, where I talked about near-real time solutions that can be developed using Microsoft and Azure tools. More about my session can be found below:
Title: Near-real time reporting in Azure
Abstract: One of the most common requirements on projects nowadays is real-time monitoring and reporting. Easy to say, expensive to implement, and complex to maintain. In this session we'll take a look at the Azure services that enable us to fulfil these requirements with minimal effort and with maxi…

[Post-Event] Codecamp 2017, Cluj-Napoca

This was the last Codecamp in Cluj-Napoca for 2017. With more than 1000 registered attendees and 50 speakers, it was a great way to close the year. There were 10 tracks in parallel, which made it possible to have 70 sessions of 45 minutes in only one day.
I had the opportunity to deliver two sessions at this event, where I talked about how to mitigate security concerns when you migrate from on-premises to Azure, and about Azure Cosmos DB. Below you can find my session details and slides.

Title: "Azure Enterprise Security in Practice"
Abstract: "What does an enterprise look like when you talk about security and the cloud? Complicated, rigid, and slow to accept cloud-based architectures. After working closely with security teams from different companies, I identified different approaches and requirements that are standard for enterprises. In this session I want to discuss and share with you lessons learned on how we can map security requirements to Azure."

Azure Enterprise Sec…

IoT Home Automation | (1) Initial Overview

After playing for a while with different IoT devices, I decided that it is the perfect moment to start doing some automation inside the house. Nothing special or fancy, just small things that improve quality of life and force me to work on some hardware devices as well.

What do I want to achieve?
There are three different systems that I want to connect, remote control, and automate.
Garage doors and gates – I want to be able to open or close the gates from my mobile phone. At the moment, I have a remote control that controls them. In the future, I do not want to use it anymore; I want to replace it with only one device – my phone.
Alarm control – Usually, when you close or open the gates, you also need to arm or disarm the house alarm. Why not integrate this into one system that can open the gates and disarm the alarm automatically?
Automatic watering system – Even if we love plants and flowers, it is a nightmare in the morning to spend 20 minutes watering all the …

Lift and Shift - cloud migration strategy

During discussions on different cloud projects, I observed that people use the “Lift and Shift” terminology with multiple meanings. This can create confusion between parties, especially when the technical team on each side understands a different thing.

What is Lift and Shift?
Lift and Shift is a migration strategy based on the concept of replicating, 1 to 1, the environment that exists on-premises inside the cloud (Microsoft Azure). This involves migrating all compute, storage, and other services without replacing them with specific Azure services.

What is not Lift and Shift?
When you have a File Server system in your current infrastructure, Lift and Shift shall not include replacing it with Azure Files. It involves just taking the File Server instances from on-premises and putting them inside Azure VMs.

Another good example is when you migrate a web farm. If you decide to do just a Lift and Shift, then you should just spin up Azure VMs where you would use A…

Isolate Web Application from public internet (App Service)

In this post, we will talk about web endpoint security. Let us start from a basic requirement and see what we need to fulfil it.
The web application hosted inside App Services shall not be publicly available from the internet.

The requirement is simple and clear, but it can give us headaches if the team does not address it from the beginning. Microsoft Azure offers three options to fulfil it:

- IP Restrictions
- App Service integration with VNET
- VNET with dedicated App Service Environment

IP Restrictions
App Services allows us to specify a list of IPs that can access our web application. The feature is similar to the IP restriction functionality offered by IIS, which can be configured inside web.config.
The difference between the two is where the check is done. With IP Restrictions, the check is done a layer before IIS. In addition, the configuration can be done from the Azure Portal or using ARM templates. There is no need to modify the configuration file of your appl…
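For comparison, the IIS-level equivalent mentioned above lives in web.config. A minimal fragment (the IP address is a placeholder) looks roughly like this:

```xml
<configuration>
  <system.webServer>
    <security>
      <!-- Deny everything that is not explicitly allowed -->
      <ipSecurity allowUnlisted="false">
        <add ipAddress="203.0.113.10" allowed="true" />
      </ipSecurity>
    </security>
  </system.webServer>
</configuration>
```

With App Service IP Restrictions, the same intent is expressed in the portal or ARM template instead, and the filtering happens before the request ever reaches IIS.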

Just deleted an Azure SQL Database by mistake! What's next?

There are times when you make mistakes, big mistakes like…
…deleting an Azure SQL Database…
In this post, we will take a look at the steps that need to be done to recover and restore a deleted database. We are in a context where you have a Standard database, without any special backup features.

I just realized that I deleted the wrong database and I do not have any custom backup mechanism configured. What should I do?

Time is crucial 
Time is one of the most important factors. Backups of deleted databases are stored only for a limited time, and the window depends on the instance type. Within this window, you can restore a deleted database without any kind of problems. The backup retention policy is 7 days for Basic and 35 days for Standard and Premium.
Azure SQL Server automatically creates backups of your databases. These backups are used to restore a deleted database. Don’t forget that, as with on-premises backups, things can go wrong during …
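The retention windows above can be turned into a quick sanity check: given the service tier and the deletion time, is the automatic backup still available? A sketch using the 7/35-day figures from this post:

```python
from datetime import datetime, timedelta

# Retention windows for deleted-database backups, per service tier
# (figures as discussed in this post).
RETENTION_DAYS = {"Basic": 7, "Standard": 35, "Premium": 35}

def can_restore(tier: str, deleted_at: datetime, now: datetime) -> bool:
    """True if the deleted database is still within its backup retention window."""
    retention = timedelta(days=RETENTION_DAYS[tier])
    return now - deleted_at <= retention

now = datetime(2017, 11, 20)
ok = can_restore("Basic", datetime(2017, 11, 15), now)        # 5 days ago: restorable
too_late = can_restore("Basic", datetime(2017, 11, 1), now)   # 19 days ago: gone on Basic
```

Note how the same 19-day-old deletion would still be recoverable on Standard or Premium, which is exactly why time and tier are the first two things to check.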

How to get your Azure Subscription Quotas and current Usage

In this post, we will talk about a simple problem inside Microsoft Azure:
How can I see what the quotas are for the Azure Subscription that I am using?

When you start to use your Azure Subscription for more than just playing, you realize after a few weeks that you do not know what your current quota limits are. In addition, it is not easy to count the number of instances of each resource.
For example, if you use multiple Azure Storage accounts in different Resource Groups, how easily can you count the number of Storage accounts that you are using?

To make the problem a little more complex, you should know that some of these quotas are per Azure Region. For example, you have a default limit of 50 VNETs per Azure Region. This means it can be pretty hard to calculate the total number of VNETs you are using in each Azure Region. It is not impossible, but you would need to do some additional work.
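Counting resources per type and region is essentially a group-by. The sketch below runs over a mocked inventory (the resource entries are made up for illustration; in practice the list would come from the Azure APIs or CLI):

```python
from collections import Counter

# Mocked inventory; in practice this comes from the Azure APIs/CLI.
resources = [
    {"type": "Microsoft.Storage/storageAccounts", "region": "westeurope"},
    {"type": "Microsoft.Storage/storageAccounts", "region": "northeurope"},
    {"type": "Microsoft.Network/virtualNetworks", "region": "westeurope"},
    {"type": "Microsoft.Network/virtualNetworks", "region": "westeurope"},
]

def count_by_region(resources, resource_type):
    """Count resources of a given type per Azure region."""
    return Counter(r["region"] for r in resources if r["type"] == resource_type)

vnets = count_by_region(resources, "Microsoft.Network/virtualNetworks")
storage = count_by_region(resources, "Microsoft.Storage/storageAccounts")
```

Because quotas like the VNET limit apply per region, the per-region counter is what you compare against the limit, not the subscription-wide total.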

Usage + Quotas
Inside the Azure Portal, we have a dedicated tile that provides this informat…

[Post Event] Codecamp Timisoara, October 14, 2017

On October 14 I attended Codecamp Timisoara. It was a great experience, where I met great people from Timisoara and other cities around Romania.
At this conference I delivered a session about enterprise security and how to mitigate different security aspects when you want to migrate to Microsoft Azure. Below, you can find more information related to it.

Title: Enterprise security in Practice
Abstract: What does an enterprise look like when you talk about security and the cloud? Complicated, rigid, and slow to accept cloud-based architectures. After working closely with security teams from different companies, I identified different approaches and requirements that are standard for enterprises. In this session I want to discuss and share with you lessons learned on how we can map security requirements to Azure.

Enterprise security in Practice from Radu Vunvulea

Migrating File Server (Share) to Azure

Let’s take a look at the current solutions on Azure when we want to migrate our local file storage from on-premises to the cloud.

In the current system, let us imagine that we have a Windows File Server that is used to share files inside the company. The File Server is fully integrated with our Active Directory server, and based on roles we allow/restrict access to different folders/files.

What do we want to do?
We want to migrate this solution to Azure in such a way that we don’t need to manage the File Server machines, while still being able to control file sharing permissions using user roles (Active Directory integration).
In addition to this, we want to be able to attach the shared content as a shared folder or partition on the client machine.

Azure Files
An extremely powerful solution provided by Microsoft, which allows us to store our files in Azure and share them with others. The SMB protocol is fully supported, meaning that we can attach the share on our client machin…

The real difference between an Azure VM with or without SSD

I want to talk about the real difference between an Azure VM with and without SSD. This is not a post with charts and artificial benchmarks; it is just a real story from the field.
One of my colleagues at work came to me complaining about a performance issue related to SQL Server. They had an SQL Server instance on an Azure VM running Linux. The DB schema was not too complex and the DB size was acceptable.
Every few hours, a job has to be executed on the database. It involves a lot of data processing and usually takes around 1 hour. From a duration perspective this is not acceptable; there is a clear NFR that requires the task to be executed in under 30 minutes.
An audit was done on the VM and the database, and it was pretty clear that there was a problem with read and write operations. Many actions were happening at that level, causing memory and storage usage to be high.
The DB specialists reviewed the database structure and the job. Unfortunately, the…

Using SQL security features to isolate sensitive data inside a PoC

When writing a PoC, you need to keep it as simple as possible and prove that, from a technology perspective, the project vision is feasible and is the right one. One of the fundamental rules of a PoC is that it needs to cover things that are not a general truth (e.g. you don’t want to prove that ASP.NET MVC can render HTML or expose a REST API).
Keeping a PoC as simple as possible can become a problem when you want to use customer data, not only mock data. When you have customer-sensitive information that should not be visible even to the development team, you might end up in a strange situation.
The problem is not how you can do this. The biggest problem is the effort you need to invest to create the deployment scripts or the automation mechanism that would allow the customer to deploy the solution in an isolated environment, where the development team does not have access. This might require extra development effort that you don’t want to include in a PoC.
It is cl…

Microsoft Ignite at a glance (Orlando, FL, 2017)

What a week! The last week of September was crazy for all the people working with the Microsoft stack. The biggest Microsoft conference took place in Orlando, Florida. More than 25,000 people attended Microsoft Ignite this year and, as usual, it was an event with all tickets sold out.

There were many announcements that make Microsoft a strong player for today's needs, and there is also a clear vision of where they want to go. Not only that, but it seems the road ahead is already defined and clear.
The current needs of the market are covered by Microsoft with Azure Stack, which offers good ground for hybrid solutions. Now we can use the core services of Microsoft Azure not only in Azure, but also on our on-premises infrastructure using Azure Stack. What is more interesting from a DevOps and IT perspective is that you have the same experience, the same dashboard, and you use the same scripts (no change is required).
Mixed reality and AI are now closer to the field. Many companies…