Micro-services: Micro-services refer to compartmentalized pieces of software that developers pull together to build complete applications.
Asyra: In the realm of software development, micro-services stand as a beacon of modernity, promising to dissect the monolithic behemoths of our digital past into interoperable, manageable pieces. Championed for their agility and scalability, they are the architects of innovation. Yet, beneath the luminescence of these benefits lies a shadow of controversy: are micro-services truly the saviours of software development, or are they the harbingers of complexity and decay?
Me: That sounds very poetic, but it is what I am referring to! Micro-services like data storage APIs, third-party vector database management services, cloud providers and more all make up essential parts of the software development process.
This is not fundamentally an issue, as it allows developers to build applications block by block, choosing services that have been tailor-made for their use case. I use the OpenAI API a lot; in fact, I used it to co-author this article by enabling Asyra AI to write parts of it, as you have noticed. But should we not be building this software ourselves?
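To make the "block-by-block" point concrete, here is a minimal sketch of what delegating text generation to the OpenAI API looks like from Python. It assumes the `openai` package and an `OPENAI_API_KEY` in the environment; the model name and prompts are illustrative placeholders, not the exact setup used for this article.

```python
# Minimal sketch: delegating text generation to a managed API.
# Assumes the `openai` Python package and OPENAI_API_KEY in the environment;
# the model name and prompts are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are Asyra, a co-author for a software article."},
        {"role": "user", "content": "Write a short paragraph about micro-services."},
    ],
)

print(response.choices[0].message.content)
```

A handful of lines buys you a capability that would otherwise take a research lab and a data centre to reproduce, which is exactly the trade-off this article is about.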
The answer to that question is "yes and no", because it is incredibly hard to build and host software like LLMs, or serverless, zero-downtime cloud hosting solutions like AWS or Render.
Why is it so hard to build software today?
Asyra: Building and hosting software like large language models (LLMs) or creating serverless, zero-downtime cloud hosting solutions in-house would necessitate a vast array of highly specialized knowledge in AI, cloud infrastructure, cybersecurity, distributed systems, and much more. It would also require ongoing maintenance and support, and substantial investment to achieve the levels of scalability, reliability, and performance provided by established services like AWS or Render.
Me: You need knowledge, and you need a substantial amount of resources, which most developers do not have access to. For example, if you were to try to run even a small 7B-parameter open-source model like Llama, your computer might be able to run it if you invested a few thousand in it, but you would be using a lot of compute to do so!
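As a rough sketch of what "running it yourself" involves, here is what loading a 7B model locally might look like, assuming the `transformers`, `torch` and `accelerate` packages, access to a Llama checkpoint on Hugging Face, and a GPU with roughly 14+ GB of VRAM for half-precision weights alone. The model id and prompt are placeholders.

```python
# Rough sketch of loading a 7B-parameter open-source model locally.
# Assumes `transformers`, `torch`, `accelerate`, access to a gated Llama
# checkpoint, and a GPU with enough VRAM (~14 GB just for float16 weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # gated model; licence acceptance required

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~2 bytes per parameter
    device_map="auto",          # spread weights across available GPU/CPU memory
)

inputs = tokenizer("Why are micro-services popular?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

And that is just loading the weights on one machine; serving the model reliably, at scale, with zero downtime is a different problem entirely.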
Pioneers in the AI space recognise this demand and are investing in cloud infrastructure to provide the compute required to run these large models for your applications.
Pinecone allows you to create a vector database over an array of documents, but its main use case is data storage and easy deployment from the cloud. Instead of writing scripts to stand up and query your own vector database for AI workloads, you can simply make an API call to Pinecone and have your vector database ready to use across various applications. This is something you could do yourself, but it would be incredibly difficult to run off your laptop.
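For example, a hedged sketch of the Pinecone workflow using their Python client (the index name, dimension and example vectors are placeholders, and the exact client API depends on the SDK version you install):

```python
# Sketch of creating and querying a managed vector index via Pinecone's API.
# Assumes the `pinecone` Python client; index name, dimension and vectors
# are illustrative placeholders.
from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")

pc.create_index(
    name="docs-index",
    dimension=1536,  # must match your embedding model's output size
    metric="cosine",
    spec=ServerlessSpec(cloud="aws", region="us-east-1"),
)

index = pc.Index("docs-index")
index.upsert(vectors=[("doc-1", [0.1] * 1536), ("doc-2", [0.2] * 1536)])

results = index.query(vector=[0.1] * 1536, top_k=2)
print(results)
```

A few API calls replace the work of provisioning, indexing, sharding and backing up a vector store yourself.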
If you need a text-to-speech synthesiser, why build it yourself when you can use an established one like ElevenLabs or Cartesia AI?
There is no reason to waste time rebuilding a function when you can pay for flexible API usage from a seasoned provider.
What is the Issue Though?
Me: They are solving huge software development problems and raking in cash by providing essential services, but the issues lie in privacy and accountability. What are some other issues that are of concern?
Asyra: Here at Asycd, we’ve encountered a few key issues that are worth mentioning:
Complexity in Coordination:
- With microservices, you have multiple services that need to communicate seamlessly. This increases network chatter and can complicate service coordination and management.
Consistency and Data Management:
- Maintaining data consistency across services is challenging. Transactions that span multiple services require careful design to avoid data inconsistencies.
Testing:
- Testing can become increasingly difficult. Simulating a production-like environment requires significant effort, as you must account for each microservice and its interactions (see the sketch after this list).
Performance Overhead:
- The use of numerous small services can introduce latency due to the communication between services, which can impact performance.
Security Concerns:
- Each microservice is a potential attack vector. Ensuring proper security measures for each service and securing communications between them can be daunting.
Deployment Complexities:
- Continuous deployment can be complex due to the distributed nature of applications. Careful handling of service upgrades, versioning, and rollbacks is essential.
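On the testing point above, a common mitigation is to stub out the external services so your test suite runs without network access. Here is a minimal sketch using Python's standard-library `unittest.mock`; the `recommend` function and the fake vector client are hypothetical stand-ins for your own code and whichever managed service it calls.

```python
# Minimal sketch: stubbing an external micro-service in a unit test.
# `recommend` and the fake client are hypothetical stand-ins; no network needed.
import unittest
from unittest.mock import Mock

def recommend(query_vector, vector_client):
    """Return the ids of the closest stored documents for a query vector."""
    response = vector_client.query(vector=query_vector, top_k=3)
    return [match["id"] for match in response["matches"]]

class RecommendTests(unittest.TestCase):
    def test_returns_ids_from_vector_service(self):
        # Stand-in for the real managed vector database client.
        fake_client = Mock()
        fake_client.query.return_value = {"matches": [{"id": "doc-1"}, {"id": "doc-2"}]}
        self.assertEqual(recommend([0.1, 0.2], fake_client), ["doc-1", "doc-2"])

if __name__ == "__main__":
    unittest.main()
```

Stubs like this keep tests fast and deterministic, but they are no substitute for integration tests against the real services, which is where the effort really goes.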
Developer Laziness
The options are endless and things are easier than ever to build, but do you really want to delegate every function to external providers?
Where is the sense of invention? Where are the developers who don’t mind building from the ground up? Who want to tailor and carefully select their own application capabilities instead of relying on default configurations?
This laziness will lead to service monopolies as well as decreased innovation, reinforced by companies mistakenly handing responsibility for their core functionality over to their micro-services.
Increasing Complexity
As we try to build and do more with software, there is an increasing need to offload complexity, but this brings issues that you have to be prepared for.
We suggest that you choose your micro-services selectively and sparingly, because provider outages are far from non-existent these days.
As you look to scale up your applications, use piece-wise analysis to develop fallback mechanisms and safety actions, so that no matter what happens you can continue to operate 🙌🏽
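As a rough sketch of what such a fallback might look like in code, here the primary provider call, the timeout behaviour and the cached answer are all hypothetical placeholders; the point is that a provider outage degrades one feature instead of taking the whole application down.

```python
# Rough sketch of a fallback wrapper around an external micro-service call.
# `call_primary_provider` and the cached answer are hypothetical placeholders.
import logging

logger = logging.getLogger(__name__)

_CACHED_FALLBACK = "Sorry, recommendations are temporarily unavailable."

def call_primary_provider(query: str) -> str:
    """Stand-in for a real API call (e.g. an LLM or vector-search provider)."""
    raise TimeoutError("provider did not respond in time")  # simulate an outage

def answer_with_fallback(query: str) -> str:
    try:
        return call_primary_provider(query)
    except Exception as exc:  # in practice, catch the provider's specific errors
        logger.warning("Primary provider failed (%s); using fallback.", exc)
        return _CACHED_FALLBACK

if __name__ == "__main__":
    print(answer_with_fallback("Suggest a database for vector search"))
```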
Check out our website for everything asycd. If you want to see some of our services and work, check out our solutions website.