
Posts

Showing posts from January, 2016

Why our Integration Tests were down

For some time, one of our projects has had its Integration Tests on RED. Some tests seem to fail randomly, and this usually happens only on the CI machine (Visual Studio Team Services). The strangest thing is the behavior: it cannot be reproduced on the development machine. The reported errors are caused by asserts or by strange exceptions - for example, that an external resource doesn't exist. From time to time the error can be reproduced on the local machine, but only once; the failure cannot be reproduced twice on the same machine. After a few sprints and a lot of time invested in this issue, we still had the same problem. After a review of the problem, the code, and how the tests were written, the root causes were identified and isolated. Let's see what the steps and causes of this problem were. Step: Isolate the integration tests in a dedicated build. Why: You want to reduce the build time as much as possible and run only the tests that you want…
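One common way to get that isolation (my sketch, not code from the post) is to tag the integration tests with a test category, so that a dedicated build can select them with a test case filter such as TestCategory=Integration; the class, test and category names below are made up for illustration.

  using Microsoft.VisualStudio.TestTools.UnitTesting;

  [TestClass]
  public class CommandStoreIntegrationTests
  {
      // A dedicated build can pick up only these tests via the
      // test case filter "TestCategory=Integration" (hypothetical filter value).
      [TestMethod]
      [TestCategory("Integration")]
      public void Command_Is_Persisted_Until_Client_Is_Available()
      {
          // arrange/act against the real external resource, then assert
          Assert.IsTrue(true); // placeholder assertion
      }
  }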

[Post Event] Refactoring and Clean Code Workshop, Sibiu, January 26, 2016

This week I had the opportunity to hold a 3-hour workshop at Lucian Blaga University, where I talked about Refactoring and Code Smells in general. We also took a look at the Visual Studio 2015 features that help us refactor our code. Below, you can find the slides from my presentation: "Refactoring & Code Smells" from Radu Vunvulea. I will not share the source code, because it is ugly and smelly (smile). See you next time!

Azure Table Performance - 1 vs 100.000 Tables under the same Storage Account

In our system we are using Azure Table to store a list of commands that need to be sent to our clients and persisted until the client is available. Because the number of clients is high (more than 100.000), it would be very expensive to store the list of commands in other resources like Redis Cache or SQL Azure. From the performance perspective, Azure Tables are amazing - very fast even at high throughput, when you store a lot of data inside them. In the first version we did a simple mapping, where we had only one Azure Table for all our clients. For each client, we had a dedicated partition in the table. This works great because an Azure Table is partitioned (scaled) based on the partition key. There is only a small problem with this approach, and it is related to maintenance and support. If a support engineer needs to look at the commands of a specific user, it will be hard for him to navigate and access the data. The second approach is to create a different Azure Table for each client…
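As a minimal sketch of the first approach (my illustration, assuming the classic WindowsAzure.Storage SDK; the entity, table and key names are made up), all clients share one table and each client gets its own partition:

  using Microsoft.WindowsAzure.Storage;
  using Microsoft.WindowsAzure.Storage.Table;

  // Hypothetical entity: one partition per client, one row per command.
  public class CommandEntity : TableEntity
  {
      public CommandEntity() { }

      public CommandEntity(string clientId, string commandId)
      {
          PartitionKey = clientId;  // all commands of a client land in one partition
          RowKey = commandId;       // unique command id within that partition
      }

      public string Payload { get; set; }
  }

  var account = CloudStorageAccount.Parse("<storage account connection string>");
  var table = account.CreateCloudTableClient().GetTableReference("Commands");
  table.CreateIfNotExists();
  table.Execute(TableOperation.Insert(
      new CommandEntity("client-0042", "cmd-001") { Payload = "..." }));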

Adding a build agent to TFS 2015

I was shocked by how simply we can create and attach a new build agent to TFS 2015. There are only a few simple and small steps that we need to follow. The new build system that comes with TFS 2015 and Visual Studio Team Services (Visual Studio Online) no longer uses the old, classic XAML build definitions. The new one allows you to define the build as a flow of steps, where you can specify what you want to happen at each step. This is done easily, directly from the web portal. If you need to create or add a new build agent to your TFS 2015 infrastructure, then you will be shocked by how easily this can be done. The first step is to download the agent from the "Agent Pools" tab. Once you have done this step, you will need to run the executable on the machine that will become the new Build Agent. I recommend running the executable file from a command prompt and providing all the information that is required. You only need to know the TFS 2015 address and to have access to TFS as…
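As a rough illustration of that command-prompt step (the folder is my own choice, and the script name can differ between agent versions, so treat the details as assumptions):

  :: Unzip the agent package downloaded from the "Agent Pools" tab, e.g. to C:\Agent,
  :: then run the configuration script from an elevated command prompt and answer
  :: the prompts: agent name, TFS address (e.g. http://yourserver:8080/tfs),
  :: agent pool, work folder, and whether to run as a Windows service.
  cd C:\Agent
  ConfigureAgent.cmd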

Why the connection is not closing when I call Close() method of EventHubClient?

These days I saw an interesting question on the internet. The answer is interesting and should be well known by everyone that is using Event Hub or other services from Service Bus over the AMQP protocol. Question: Why is the connection not closing when I call the eventHubClient.Close() method? var factory = MessagingFactory.CreateFromConnectionString("@#$%"); var client = factory.CreateEventHubClient("rveh"); ... client.Close(); // or client.CloseAsync() By default, the communication protocol used by Event Hub is AMQP. The AMQP protocol uses a connection over TCP. Opening and closing an AMQP connection is expensive, but the good part of AMQP is that we have a session that can be persisted and reused between different requests. When we call the "Close" method, we are only notifying the MessagingFactory that we don't need the connection anymore. It will not trigger closing/deleting the connection between our machine/system and Event Hub. The MessagingFactory…
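The practical consequence (my sketch, assuming the Microsoft.ServiceBus.Messaging client of that era; the connection string is a placeholder): if you want the underlying AMQP/TCP connection torn down, close the MessagingFactory itself, not only the client it created.

  using Microsoft.ServiceBus.Messaging;

  var factory = MessagingFactory.CreateFromConnectionString("<Event Hub connection string>");
  var client = factory.CreateEventHubClient("rveh");

  // ... send events ...

  client.Close();   // releases the client; the AMQP connection stays alive for reuse
  factory.Close();  // closes the factory and tears down the underlying AMQP/TCP connection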

Tools to migrate to/from Visual Studio Team Services (Visual Studio Online, TFS Online)

Visual Studio Team Services (known as VSO - Visual Studio Online) is a powerful tool that allows us to share code, track work, create builds, deploy them and so on. Everything that you can do using TFS 2015, you can now do from Visual Studio Team Services - AS A SERVICE. We no longer need to install and manage TFS Servers or build agents - life is easier. When you need to do a migration from on-premises to an external provider (in our case, the cloud), you need to see how you can: migrate automatically, with minimal or no human intervention; migrate all change sets; migrate the task list. This can be done easily using "OpsHub Visual Studio Online Migration Utility". This is a simple tool that allows us to migrate all our data from on-premises to the cloud without any kind of problems. If you have custom templates, then you might need to do some custom configuration, but otherwise it works pretty smoothly. Another tool that should be used for situations like this…

(Part 1) Store and Share Sensitive Information - Current solutions

A few weeks ago I had to share the admin and billing access keys of an Azure account. This account is used by a product that is in production. Usually this is a simple task: you send an encrypted email with the information, or you write it on a piece of paper. But I said NO - let's see why, and what other solutions we could have. Sharing sensitive information is not a simple task, and storing it in a safe and secure way is even harder. Sensitive information can be anything: username and password, tokens, email and password, or a connection string - basically anything that grants you access to a specific resource (Azure Account, WebApp, VM, Email Account and so on). For my personal content I use an encrypted USB stick that is kept in a secure location at all times. This is not applicable when you have customer or production access keys. Let's see where you should never store this kind of information: On a piece of paper - anybody can see it and take a photo of it. On top of this, you…