Saying that I’m from the time of MDAC would break the main rule of never revealing one’s age. But then, who really remembers Microsoft Data Access Components – MDAC – today?
Microsoft had ODBC, and it was good. Then, out of the blue, came Borland with a thing called the BDE, which was way faster and shipped with Delphi, a direct competitor of Visual Basic at the time. Microsoft had to create a new technology fast. That’s when MDAC was born.
Things were a bit crazy considering how fast MDAC was created. Windows brought one version of MDAC, let’s suppose it was 2.1. Office and Access brought a different one, let’s suppose 2.3. SQL Server had 2.6 and Visual Basic 2.8. Of course, the version numbers are not correct – this happened about 30 years ago – but you get the idea. It was craziness trying to understand which version was where.
Don’t know what MDAC was? What if I call it ADO – is that more familiar to you? Yes, I’m talking about ADO, and it’s that old.
That was also the time of the big conferences. I was living in Rio, and it was the first time I boarded a plane to attend Microsoft TechEd in São Paulo.
It’s difficult to compare that time with today. One speaker told us he had been building the demo scenario for the conference on the plane and noticed how difficult it was to update COM classes. Still on the plane, he created an add-in for Visual Basic to unregister, compile, and re-register a COM class. It was a huge improvement for COM+ development and became part of the product less than a year later.
This same speaker addressed, on stage, the problem of multiple MDAC versions being attached to different products. His sentence was something similar to this:
“We know the pain this caused you; we know it was a mess. We will never release new versions this way again. New versions will only be released with the entire product.”
After Some Years
Around the end of 2001 and the beginning of 2002, Microsoft suffered some severe security problems in its products. The problems were so severe and so public that Microsoft stopped all software development for the entire month of February 2002, making every developer focus on security reviews of the code.
I hope I’m not mixing up dates, but it was around this time that I remember a video explaining some pieces of the software development process at Microsoft. What impressed me most was the explanation that, after a day of development, all the software was integrated, spent the entire night being tested, and the developers had access to the test results in the morning.
With this level of automated testing, the chance of leaving small bugs behind would be low. Only the hard-to-find ones would slip through.
The image below, of course, is from many years ago.
Today’s World
Let’s talk about today’s world through some examples.
LocationMode for Azure Storage access
LocationMode was a great property in the Azure Storage library. The storage has replication by default, and the LocationMode property allows you to use the replicated storage to balance the workload with the primary storage, among other purposes.
You can see details about the LocationMode property in Microsoft’s documentation.
The problem: the last version of the library with this property is version 11. Version 12 of the library doesn’t have it. A documentation page on Microsoft’s website says “We are working on producing documentation for this”, but the property is just not there.
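For those who never used it, here’s a minimal sketch of LocationMode with the version 11 library (Microsoft.Azure.Storage.Blob), assuming an RA-GRS account; the connection string and container name are placeholders:

```csharp
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.RetryPolicies;

// Placeholder connection string; the account must be RA-GRS so the
// secondary endpoint can serve read requests.
CloudStorageAccount account =
    CloudStorageAccount.Parse("<your RA-GRS connection string>");
CloudBlobClient client = account.CreateCloudBlobClient();

// Prefer the secondary replica for reads, falling back to the primary.
BlobRequestOptions options = new BlobRequestOptions
{
    LocationMode = LocationMode.SecondaryThenPrimary
};

CloudBlobContainer container = client.GetContainerReference("demo");
// The per-request options carry the LocationMode to the service call.
bool exists = container.Exists(options, operationContext: null);
```

The same options object can be passed to downloads and listings, letting read traffic spread across both endpoints.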
Durable Functions
Durable Functions allow us to create many different long-running architectures on top of the Azure Functions environment. The problem: only .NET Core 3.1 LTS supports them. .NET 5 and .NET 6 (until this month) don’t. A version of the .NET 6 support is expected to become available this month that will make Durable Functions available again.
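To make the feature concrete, here’s a minimal orchestration sketch using the in-process Durable Functions extension (Microsoft.Azure.WebJobs.Extensions.DurableTask) on .NET Core 3.1; the function names are just illustrative:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class HelloSequence
{
    // Orchestrator: coordinates long-running work by scheduling activities.
    [FunctionName("HelloSequence")]
    public static async Task<string> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        string first = await context.CallActivityAsync<string>("SayHello", "Rio");
        string second = await context.CallActivityAsync<string>("SayHello", "São Paulo");
        return $"{first} {second}";
    }

    // Activity: the unit of actual work the orchestrator calls.
    [FunctionName("SayHello")]
    public static string SayHello([ActivityTrigger] string city)
        => $"Hello, {city}!";
}
```

This is exactly the kind of code you couldn’t bring along when moving the function app to .NET 5.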
All this creates a problem: a version upgrade is not just a version upgrade anymore. If we want to enjoy the benefits of a new version, we end up having to leave some features behind. Is this a good thing?
It may be worth reading the following blog about LTS support and its meaning. I leave it up to you to comment on the fact that the name resembles Extended Support far too much, becoming one more technical detail to learn and one more source of confusion.
System.Data.SqlClient and Microsoft.Data.SqlClient
Microsoft.Data.SqlClient is replacing System.Data.SqlClient, but it’s being done in a way that keeps both living side by side. What are the chances of this going wrong?
Microsoft.Data.SqlClient receives all the new feature implementations. For example, authentication with managed identities is supported only by this library. However, using it isn’t the most straightforward process. A simple Google query (sin! mortal sin!) shows how easy it is to reach dependency hell if you try to use it with .NET 5 in an Azure Function. This blog is one example. The proposed solution? Downgrade .NET.
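As a reference for what the newer library offers, here’s a minimal sketch of managed identity authentication, which only Microsoft.Data.SqlClient (2.1 or later) understands; the server and database names are placeholders:

```csharp
using System;
using Microsoft.Data.SqlClient;

// The "Active Directory Managed Identity" keyword is only understood by
// Microsoft.Data.SqlClient; System.Data.SqlClient would reject it.
var connectionString =
    "Server=tcp:<your-server>.database.windows.net,1433;" +
    "Database=<your-database>;" +
    "Authentication=Active Directory Managed Identity;";

using var connection = new SqlConnection(connectionString);
connection.Open();

// Returns the identity the connection authenticated as.
using var command = new SqlCommand("SELECT SUSER_SNAME()", connection);
Console.WriteLine(command.ExecuteScalar());
```

No password, no secret in configuration – which is exactly why people want this library, dependency hell or not.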
It’s not only about Development
SAS Policies on containers
Did you know you can create a shared access policy on an Azure storage container, and that this policy works as guidance for producing SAS keys for the storage?
Yes, you can! But almost no one knows, for one simple reason: you can’t produce SAS keys based on the SAS policy using the portal. Azure Storage Explorer can do it, but most people will miss it because they only look for the feature in the portal.
This, plus the fact that the user can simply ignore the SAS policy and create a SAS key any way they like, makes this feature… a bit strange.
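The SDK, at least, does expose the feature. Here’s a minimal sketch with the v12 Azure.Storage.Blobs library, assuming a connection string that includes the account key; the policy and container names are made up:

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Sas;

var container = new BlobContainerClient("<your connection string>", "demo");

// Define a stored access policy on the container: read + list for 7 days.
var policy = new BlobSignedIdentifier
{
    Id = "read-only-policy",
    AccessPolicy = new BlobAccessPolicy
    {
        PolicyStartsOn = DateTimeOffset.UtcNow,
        PolicyExpiresOn = DateTimeOffset.UtcNow.AddDays(7),
        Permissions = "rl"
    }
};
container.SetAccessPolicy(permissions: new[] { policy });

// Generate a SAS that references the policy instead of inline permissions,
// so revoking the policy revokes every SAS issued from it.
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = container.Name,
    Resource = "c",
    Identifier = "read-only-policy"
};
Uri sasUri = container.GenerateSasUri(sasBuilder);
Console.WriteLine(sasUri);
```

That revocation story is the whole point of policies – and it’s invisible if you only ever use the portal.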
The Hierarchical Namespace hell
In an Azure Storage account, we can enable Hierarchical Namespace or not. Can you correctly tell which features require Hierarchical Namespace enabled and which features require it disabled?
I know some features that require it disabled: Azure SQL Extended Events and temporary storage for backup restores are two of them. Things become even stranger when new features in Azure Storage, such as soft delete, only work with it disabled. What is this option for, then? Data lakes. But PolyBase and its flavors in Synapse and SQL Server can work well without this feature enabled. Confused?
Conclusion
This list could go on and on. Even after finishing, I will probably think of many more items that could figure here.
I don’t have answers, only questions. Thirty years have passed, but does that justify going in exactly the opposite direction from the one promised at that conference in São Paulo? And what about the nightly integration and tests – are they still happening?
Young developers may take pride in being able to navigate dependency hell. Knowledge brings pride, but this knowledge shouldn’t, because in the end, it’s still dependency hell.
Wouldn’t it be so much easier if it just worked?
The post Dependency Hell: Past and Future appeared first on Simple Talk.