Cloud Computing Solutions

Explore top LinkedIn content from expert professionals.

  • View profile for Sean Connelly🦉

    Zscaler | Fmr CISA - Zero Trust Director & TIC Program Manager | NIST 800-207 ZTA co-author

    21,982 followers

🚨NSA Releases Guidance on Hybrid and Multi-Cloud Environments🚨

    The National Security Agency (NSA) recently published an important Cybersecurity Information Sheet (CSI): "Account for Complexities Introduced by Hybrid Cloud and Multi-Cloud Environments." As organizations increasingly adopt hybrid and multi-cloud strategies to enhance flexibility and scalability, understanding the complexities of these environments is crucial for securing digital assets. This CSI provides a comprehensive overview of the unique challenges presented by hybrid and multi-cloud setups.

    Key Insights Include:

    🛠️ Operational Complexities: Addressing the knowledge and skill gaps that arise from managing diverse cloud environments and the potential for security gaps due to operational silos.

    🔗 Network Protections: Implementing Zero Trust principles to minimize data flows and secure communications across cloud environments.

    🔑 Identity and Access Management (IAM): Ensuring robust identity management and access control across cloud platforms, adhering to the principle of least privilege.

    📊 Logging and Monitoring: Centralizing log management for improved visibility and threat detection across hybrid and multi-cloud infrastructures.

    🚑 Disaster Recovery: Utilizing multi-cloud strategies to ensure redundancy and resilience, facilitating rapid recovery from outages or cyber incidents.

    📜 Compliance: Applying policy as code to ensure uniform security and compliance practices across all cloud environments.

    The guide also emphasizes the strategic use of Infrastructure as Code (IaC) to streamline cloud deployments and the importance of continuous education to keep pace with evolving cloud technologies. As organizations navigate the complexities of hybrid and multi-cloud strategies, this CSI provides valuable insights into securing cloud infrastructures against the backdrop of increasing cyber threats. Embracing these practices not only fortifies defenses but also ensures a scalable, compliant, and efficient cloud ecosystem.

    Read NSA's full guidance here: https://lnkd.in/eFfCSq5R

    #cybersecurity #innovation #ZeroTrust #cloudcomputing #programming #future #bigdata #softwareengineering
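
    To make the "policy as code" idea concrete, here is a minimal Python sketch, not taken from the NSA CSI: the resource fields, provider names, and rules below are hypothetical, but they illustrate the point that one machine-readable rule set can be evaluated uniformly against resources from every cloud in a hybrid or multi-cloud estate.

    ```python
    # Illustrative policy-as-code check (hypothetical schema, not from the NSA guidance):
    # evaluate IaC-style resource definitions against one shared rule set so the same
    # policy applies to every provider in a hybrid/multi-cloud environment.
    from dataclasses import dataclass


    @dataclass
    class StorageResource:
        name: str
        provider: str          # e.g. "aws", "azure", "gcp"
        public_access: bool
        encrypted_at_rest: bool


    def evaluate_policy(resources):
        """Return human-readable violations for resources that allow public
        access or are not encrypted at rest."""
        violations = []
        for r in resources:
            if r.public_access:
                violations.append(f"{r.provider}/{r.name}: public access must be disabled")
            if not r.encrypted_at_rest:
                violations.append(f"{r.provider}/{r.name}: encryption at rest is required")
        return violations


    if __name__ == "__main__":
        inventory = [
            StorageResource("audit-logs", "aws", public_access=False, encrypted_at_rest=True),
            StorageResource("marketing-assets", "azure", public_access=True, encrypted_at_rest=False),
        ]
        for v in evaluate_policy(inventory):
            print("POLICY VIOLATION:", v)
    ```

    In practice the same idea is usually expressed with dedicated tooling (for example OPA or provider-native policy engines), but the principle is identical: the policy lives in version control and is applied identically across clouds.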

  • View profile for David Linthicum

    Top 10 Global Cloud & AI Influencer | Enterprise Tech Innovator | Strategic Board & Advisory Member | Trusted Technology Strategy Advisor | 5x Bestselling Author, Educator & Speaker

    191,770 followers

🌍 The Shift in Europe: Moving Away from US Hyperscalers 🌩️

    As geopolitical concerns, data sovereignty, and pricing instability grow, European companies are making bold moves in their cloud strategies—and the implications are massive. Over the past 15 years, reliance on public cloud giants like AWS, Microsoft, and Google has skyrocketed. But now, we’re seeing a strategic pivot unfolding across Europe, as organizations mitigate risks and embrace alternative solutions to protect their future.

    🎯 Why the shift?

    ✅ Data Sovereignty: Stricter data protection laws like GDPR and fears over compliance with laws like the US CLOUD Act are driving demand for European-managed cloud solutions and sovereign cloud providers. Organizations are prioritizing control over their sensitive data and leaning into platforms that support their unique privacy needs.

    ✅ Security and Trust: Concerns over potential government interference, espionage, and vendor lock-in are making European businesses rethink their current reliance on US-based hyperscalers. The rising interest in diverse, multi-cloud strategies and locally governed services reflects the growing importance of trust in cloud decisions.

    ✅ Economic Predictability: Increasing costs from hyperscalers have raised concerns about long-term pricing stability. Enterprises are recognizing that forward-looking cloud strategies need to include providers that prioritize pricing transparency and tailored solutions.

    🎯 What’s the result?

    A diverse and dynamic cloud ecosystem is emerging in Europe, leaning on open-source technologies, sovereign cloud providers, and tailored private cloud solutions. Platforms like OpenStack and others are paving the way for digital transformation without compromising on compliance or strategy. As businesses explore these new approaches, multi-cloud strategies, hybrid environments, and innovative pricing models are becoming essential for mitigating risks and staying competitive within an ever-evolving cloud landscape.

    📢 This shift isn’t just about technology—it’s about geopolitics, trust, and long-term business resilience. Let’s embrace a future where diversity in cloud ecosystems fosters innovation, enhances security, and ensures sovereignty.

    What are your thoughts on this shift towards sovereign and multi-cloud solutions? 💭 Let’s discuss!

    #CloudComputing #DataSovereignty #SovereignCloud #MultiCloud #Geopolitics #Innovation

    Why Europe Is Fleeing The Cloud

    https://www.youtube.com/

  • View profile for SHAILJA MISHRA🟢

Data and Applied Scientist 2 at Microsoft | Top Data Science Voice | 175k+ on LinkedIn

    180,792 followers

Imagine you have 5 TB of data stored in Azure Data Lake Storage Gen2 — this data includes 500 million records and 100 columns, stored in CSV format.

    Now, your business use case is simple:
    ✅ Fetch data for 1 specific city out of 100 cities
    ✅ Retrieve only 10 columns out of the 100

    Assuming data is evenly distributed, that means:
    📉 You only need 1% of the rows and 10% of the columns,
    📦 which is ~0.1% of the entire dataset, or roughly 5 GB.

    Now let’s run a query using Azure Synapse Analytics - Serverless SQL Pool.

    🧨 Worst Case: If you're querying the raw CSV file without compression or partitioning, Synapse will scan the entire 5 TB. 💸 The cost is $5 per TB scanned, so you pay $25 for this query. That’s expensive for such a small slice of data!

    🔧 Now, let’s optimize:
    ✅ Convert the data into Parquet format – a columnar storage file type. 📉 This reduces your storage size to ~2 TB (or even less with Snappy compression).
    ✅ Partition the data by city, so that each city has its own folder.

    Now when you run the query:
    - You're only scanning 1 partition (1 city) → ~20 GB
    - You only need 10 columns out of 100 → 10% of 20 GB = 2 GB
    💰 Query cost? Just $0.01

    💡 What did we apply?
    - Column Pruning by using Parquet
    - Row Pruning via Partitioning
    - Compression to save storage and scan cost

    That’s 2500x cheaper than the original query! 👉 This is how knowing the internals of Azure’s big data services can drastically reduce cost and improve performance.

    #Azure #DataLake #AzureSynapse #BigData #DataEngineering #CloudOptimization #Parquet #Partitioning #CostSaving #ServerlessSQL
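
    Assuming a Spark pool (or any Spark runtime) is available next to the serverless SQL pool, a minimal PySpark sketch of the conversion described above could look like the following. The storage account, container, paths, and the "city" column are hypothetical placeholders, not the author's actual dataset.

    ```python
    # Illustrative only: convert a large CSV dataset to Snappy-compressed Parquet,
    # partitioned by city, so downstream serverless SQL queries scan one folder
    # and read only the column chunks they need. Paths/columns are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("csv-to-partitioned-parquet").getOrCreate()

    raw = spark.read.csv(
        "abfss://data@examplelake.dfs.core.windows.net/raw/records/",  # hypothetical container
        header=True,
        inferSchema=True,
    )

    (raw.write
        .mode("overwrite")
        .partitionBy("city")                 # row pruning: one folder per city
        .option("compression", "snappy")     # smaller files, cheaper scans
        .parquet("abfss://data@examplelake.dfs.core.windows.net/curated/records_parquet/"))
    ```

    A serverless SQL query that filters on one city and selects 10 columns then touches only that city's folder and only those columns' pages in the Parquet files, which is where the cost drop in the post comes from.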

  • View profile for Louis C.

    Marketing & Product Mgmt. Leader | LinkedIn Top Voice | Software Expertise in AI, Analytics, ERP, Cloud, CPQ & Cybersecurity

    10,336 followers

𝗧𝗵𝗲 𝘄𝗮𝗹𝗹𝗲𝗱 𝗴𝗮𝗿𝗱𝗲𝗻 𝗰𝗿𝗮𝗰𝗸𝘀: 𝗡𝗮𝗱𝗲𝗹𝗹𝗮 𝗯𝗲𝘁𝘀 𝗠𝗶𝗰𝗿𝗼𝘀𝗼𝗳𝘁’𝘀 𝗖𝗼𝗽𝗶𝗹𝗼𝘁𝘀 𝗮𝗻𝗱 𝗔𝘇𝘂𝗿𝗲’𝘀 𝗻𝗲𝘅𝘁 𝗮𝗰𝘁 𝗼𝗻 𝗔𝟮𝗔 & 𝗠𝗖𝗣 𝗶𝗻𝘁𝗲𝗿𝗼𝗽𝗲𝗿𝗮𝗯𝗶𝗹𝗶𝘁𝘆

    Microsoft CEO Satya Nadella is redefining cloud competition by moving away from Azure's traditional "walled garden." The new strategy: supporting open protocols like Google's Agent2Agent (A2A) and Anthropic's Model Context Protocol (MCP), positioning Microsoft Azure Copilots and AI services for broad interoperability across cloud environments, including Amazon Web Services (AWS), Google Cloud, and private data centers.

    From my recent VentureBeat analysis, here are three reasons this shift matters:

    💡 Strategic Inflection Point: Microsoft is publicly endorsing and implementing A2A and MCP, aiming to make Azure a hub for genuine agent-to-agent interoperability across the industry.

    📈 Enterprise Agility: By embracing open standards, Microsoft is reducing vendor lock-in and giving organizations greater freedom to innovate and manage AI workloads wherever they choose.

    ⚙️ Technical Enablement: Azure's Copilots and AI platforms, such as Copilot Studio and Azure AI Foundry, are being built with open APIs and integration frameworks, simplifying and accelerating multi-cloud operations and adoption of interoperable AI solutions.

    𝗕𝗼𝘁𝘁𝗼𝗺 𝗹𝗶𝗻𝗲: The era of isolated clouds is coming to an end, and Microsoft is positioning itself as a key catalyst in that transformation. Full analysis linked in the first comment.

    #Azure #AI #MultiCloud #Interoperability #CloudStrategy #EnterpriseTech #Microsoft

  • View profile for Darren Grayson Chng

    Regional Director | Privacy, AI, Cyber | Former Regulator | AI Law & IEEE AI Peer Reviewer | ISO 42001, AIGP

    9,838 followers

Here's the last post sharing what I spoke about during PDP Week. Our moderator Christopher (2024 Global Vanguard Award for Asia) comes up with the most creative titles for panel discussions. He called this one 'Weather Forecast: Cloudy with a Chance of Breach'. Together with Aparna and Abhishek, we talked about privacy and security in the cloud.

    1. Who do you typically engage with in relation to privacy and security for the cloud?
    I wanted to dispel the misconception that if a company engages a cloud service provider (CSP) to store your data, the CSP is responsible for privacy and security and the company doesn't need to do anything. Generally, the cloud customer is still responsible for security in the cloud, e.g. configuring user access to data and to the services the customer uses. The CSP is responsible for security of the cloud, e.g. physical protection of servers and patching flaws. This is known as "shared responsibility" between the CSP and the cloud customer. The extent of each party's responsibilities depends on the deployment model used, e.g. SaaS, PaaS, IaaS.

    2. Shared responsibility also applies within organisations, e.g.:
    - IT helps with technical implementation and maintenance of cloud services
    - IT security helps protect data from unauthorised access
    - Privacy, Legal, and Compliance provide guidance on compliance with laws, and ensure that contracts with CSPs and vendors include privacy and security clauses

    3. What tools/processes are involved in privacy considerations for securing cloud use?
    They include a Privacy Impact Assessment when, e.g., new cloud services are used to process sensitive data, or when cloud use involves data transfers to various countries. Privacy management tools include encryption, anonymisation, pseudonymisation, and access controls. CSPs usually make audit reports available to prospective and current customers; you can request them. Also, have a well-defined incident response plan.

    4. How do you implement and manage breach or incident response for the multi-cloud?
    Multi-cloud environments can be challenging, because each CSP may have its own set of interfaces, tools, and processes for incident response. You need to develop a unified incident response framework that can be applied across all cloud providers, which defines standard procedures for detecting, reporting, and responding to incidents, and which can enable collaboration between different cloud environments. The framework must facilitate internal coordination between various teams, as well as external coordination with CSPs. CSPs play a critical role in incident response, as they control the infrastructure and have visibility into their own environments. Ensure that roles and responsibilities are clearly defined, and that you understand your legal obligations in relation to breach notification, e.g. who you need to notify and by when. Get corp comms' help with communication strategies vis-a-vis affected parties, regulators, staff, and other stakeholders.

    #APF24

  • View profile for Helen Orgis

    Director Strategic Alliances I Caring about Great Partnerships I Helping B2B Startups Master Their Go-To-Market | Soon-to-be ADHD Coach | TenMoreIn Alumni

    7,625 followers

𝗪𝗵𝘆 𝗮𝗻 𝟭𝟭 𝗕𝗶𝗹𝗹𝗶𝗼𝗻 𝗘𝘂𝗿𝗼 𝗜𝗻𝘃𝗲𝘀𝘁𝗺𝗲𝗻𝘁 𝗶𝗻 𝘁𝗵𝗲 𝗦𝗽𝗿𝗲𝗲𝘄𝗮𝗹𝗱 𝗦𝗵𝗼𝘂𝗹𝗱 𝗠𝗮𝘁𝘁𝗲𝗿 𝘁𝗼 𝗘𝘃𝗲𝗿𝘆 𝗘𝘂𝗿𝗼𝗽𝗲𝗮𝗻 𝗕𝘂𝘀𝗶𝗻𝗲𝘀𝘀 𝗟𝗲𝗮𝗱𝗲𝗿❓

    I was in Munich last week for the ServiceNow World Tour, and the enormous interest in the "Digital Sovereignty for Europe" breakout with Schwarz Group and STACKIT was palpable, with people standing just to get a view of the session. It's clear that #DigitalSovereignty has moved from a regulatory buzzword to a CEO-level strategic imperative.

    Now, with Schwarz Digits announcing a massive 11 billion euro investment in a new AI and data center in #Lübbenau, Germany (where STACKIT will operate its 5th facility), the European tech landscape is 𝘧𝘪𝘯𝘢𝘭𝘭𝘺 taking action.

    Here's my take on the new dynamics and what decision-makers need to know:

    1️⃣ 𝗧𝗵𝗲 𝗘𝘂𝗿𝗼𝗽𝗲𝗮𝗻 𝗖𝗼𝘂𝗻𝘁𝗲𝗿-𝗔𝘁𝘁𝗮𝗰𝗸: The €11B investment is a direct challenge to US hyperscalers. It's about more than just physical infrastructure. It's about building an independent, high-performance platform for AI and cloud that is governed entirely by EU law (GDPR-compliant, protected from the US CLOUD Act). This is about choice and control for European enterprises. I would say 𝘣𝘦𝘵𝘵𝘦𝘳 𝘭𝘢𝘵𝘦 𝘵𝘩𝘢𝘯 𝘯𝘦𝘷𝘦𝘳!

    2️⃣ 𝗧𝗵𝗲 𝗛𝘆𝗽𝗲𝗿𝘀𝗰𝗮𝗹𝗲𝗿 𝗦𝘁𝗿𝗮𝘁𝗲𝗴𝘆 𝗣𝗶𝘃𝗼𝘁: Major US software companies are adapting to keep a multi-trillion-dollar market. The old way was 'one cloud fits all.' The new alliance model is 'sovereignty-by-design.' They are now considering partnering with trusted European infrastructure providers like #STACKIT (part of Schwarz Digits) to offer Sovereign Cloud solutions. New alliances are forming!

    3️⃣ 𝗧𝗵𝗲 𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝗡𝗼𝘄 𝗕𝗹𝘂𝗲𝗽𝗿𝗶𝗻𝘁: This is where the rubber meets the road. The partnership of 𝘚𝘦𝘳𝘷𝘪𝘤𝘦𝘕𝘰𝘸 𝘰𝘯 𝘚𝘵𝘢𝘤𝘬𝘐𝘛 is a prime example. It allows businesses to leverage the power of ServiceNow's AI platform (with, e.g., full feature parity) while ensuring all data is hosted and processed securely within the StackIT cloud, meeting European data and compliance requirements. It's a pragmatic path to both innovation and sovereignty.

    👉🏽 That clearly shows that digital sovereignty is not about closing the door! It's about building our own foundation. The combination of local investment, strategic alliances, and platforms like ServiceNow on StackIT is creating a resilient and competitive digital future for Europe.

    What is your organization doing to secure its digital future while maintaining its sovereignty?

  • View profile for Pratik Gosawi

    Senior Data Engineer | LinkedIn Top Voice '24 | AWS Community Builder | Freelance Big Data and AWS Trainer

    20,542 followers

ETL vs ELT in Data Engineering

    ETL: Extract, Transform, Load
    ETL is the traditional approach:
    1. Extract:
     ↳ Data is extracted from source systems.
    2. Transform:
     ↳ Extracted data is transformed (cleaned, formatted, etc.) in a staging area.
    3. Load:
     ↳ Transformed data is loaded into the target system (usually a data warehouse).

    Pros of ETL:
    - Data is cleaned and transformed before loading, ensuring high-quality data in the target system.
    - Reduces storage requirements in the target system, as only relevant data is loaded.
    - Better for complex transformations that require significant processing power.
    - Ideal for systems with limited computing resources at the destination.

    Cons of ETL:
    - Can be slower due to transformation before loading.
    - May require more processing power in the intermediate stage.
    - Less flexible if transformation requirements change frequently.

    Use Cases for ETL:
    - Working with legacy systems that require specific data formats.
    - Data quality is a critical concern and needs to be addressed before loading.
    - The target system has limited computing resources.

    ELT: Extract, Load, Transform
    ELT is a more modern approach:
    1. Extract:
     ↳ Data is extracted from source systems.
    2. Load:
     ↳ The raw data is loaded directly into the target system.
    3. Transform:
     ↳ Data is transformed within the target system as needed.

    Pros of ELT:
    - Faster initial load of data, as there's no transformation before loading.
    - More flexible, allowing transformations to be modified without reloading data.
    - Takes advantage of the target system's processing power.
    - Raw data is preserved, allowing for different transformations as needs change.

    Cons of ELT:
    - More storage is required in the target system, as all raw data is loaded.
    - May result in lower-quality data in the target system if not managed properly.
    - Can be more complex to implement and manage.

    Use Cases for ELT:
    - Working with cloud-based data warehouses.
    - Flexibility is needed for transformations on the same dataset.
    - The target system has significant computing resources.

    Real-World Example: Customer Analytics Platform on AWS
    Consider a real-world scenario where a retail org wants to build a customer analytics platform using AWS.

    ETL Architecture:
    1. Extract:
       - Use AWS DMS to extract data from an on-premises database.
       - Use Glue crawlers to catalog data in S3 containing log files and other semi-structured data.
    2. Transform:
       - Use AWS Glue ETL jobs to transform the data.
    3. Load:
       - Load the transformed data into Redshift, a data warehouse optimized for analytics.
    4. Orchestration:
       - Use AWS Step Functions to orchestrate the entire ETL pipeline.

    ELT Architecture:
    1. Extract:
       - Use DMS and Glue crawlers for data extraction.
    2. Load:
       - Load raw data into an S3 data lake.
    3. Transform:
       - Use Athena for on-demand SQL transformations.
       - Use Redshift Spectrum to query both structured data in Redshift and unstructured data in S3.
    4. Orchestration:
       - Use AWS Glue to manage the ELT process.
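
    As a rough illustration of the "Transform" step in the ETL architecture above, here is a minimal PySpark sketch. The bucket names, paths, and column names are hypothetical, and the job is written as plain PySpark so it stays self-contained; on AWS it could run as a Glue or EMR job, which is an assumption rather than the author's actual pipeline.

    ```python
    # Illustrative Transform step of an ETL pipeline (hypothetical buckets/columns).
    # Extract: raw CSV landed in S3 (e.g. by AWS DMS). Transform: clean and conform.
    # Load: write Parquet to a curated prefix for ingestion into the warehouse.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-etl-transform").getOrCreate()

    orders = spark.read.csv("s3://example-raw-zone/orders/", header=True, inferSchema=True)

    cleaned = (
        orders
        .dropDuplicates(["order_id"])                          # remove replayed records
        .filter(F.col("order_status").isNotNull())             # drop incomplete rows
        .withColumn("order_date", F.to_date("order_date"))     # normalise types
        .withColumn("order_total", F.col("order_total").cast("decimal(12,2)"))
    )

    (cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-zone/orders/"))
    ```

    In the ELT variant, the same raw files would be loaded untouched into the data lake and this cleaning logic would instead run inside the target system, for example as Athena SQL over the raw prefix.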

  • View profile for Mani Chandrasekaran

Field CTO and Enterprise Technologist at AWS India & South Asia | Cloud Architecture, Gen AI, Product, App Modernization | Independent Director (IICA) | Certifications - All AWS, Kubernetes, GCP, Azure, NVIDIA & CCSP

    17,819 followers

I'm always on the lookout for "AWS" scale customer case studies 😎 !!

    This recent blog describes how Ancestry tackled one of the most impressive data engineering challenges I've seen recently: optimizing a 100-billion-row Apache Iceberg table that processes 7 million changes every hour. The scale alone is staggering, but what's more impressive is their 75% cost reduction achievement.

    𝐓𝐡𝐞 𝐀𝐖𝐒-𝐏𝐨𝐰𝐞𝐫𝐞𝐝 𝐒𝐨𝐥𝐮𝐭𝐢𝐨𝐧
    Their architecture combines Amazon EMR on EC2 for Spark processing, Amazon S3 for data lake storage, and AWS Glue Catalog for metadata management. This replaced a fragmented ecosystem where teams were independently accessing data through direct service calls and Kafka subscriptions, creating unnecessary duplication and system load.

    𝐖𝐡𝐲 𝐈𝐜𝐞𝐛𝐞𝐫𝐠 𝐌𝐚𝐝𝐞 𝐭𝐡𝐞 𝐃𝐢𝐟𝐟𝐞𝐫𝐞𝐧𝐜𝐞
    Apache Iceberg's ACID transactions, schema evolution, and partition evolution capabilities proved essential at this scale. The team implemented a merge-on-read strategy and Storage-Partitioned Joins to eliminate expensive shuffle operations, while custom partitioning on hint status and type dramatically reduced data scanning during queries.

    𝐄𝐧𝐭𝐞𝐫𝐩𝐫𝐢𝐬𝐞-𝐒𝐜𝐚𝐥𝐞 𝐑𝐞𝐬𝐮𝐥𝐭𝐬
    This solution now serves diverse analytical workloads - from data scientists training recommendation models to geneticists developing population studies - all from a single source of truth. It demonstrates how modern table formats combined with AWS managed services can handle unprecedented data scale while maintaining performance and controlling costs.

    More details in the blog at https://lnkd.in/gN-mvdUE

    #bigdata #iceberg #aws #ancestry #analytics #scale #apache
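
    For readers unfamiliar with the merge-on-read and partitioning techniques mentioned above, here is a minimal sketch of how they are typically configured for an Iceberg table in Spark. The catalog, table, and column names ("hint_status", "hint_type", "updates_batch") are hypothetical stand-ins, not Ancestry's actual schema or code, and it assumes a SparkSession already configured with the Iceberg runtime and a Glue-backed catalog.

    ```python
    # Illustrative Iceberg merge-on-read + partitioning setup (hypothetical names).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("iceberg-mor-example").getOrCreate()

    spark.sql("""
        CREATE TABLE IF NOT EXISTS glue_catalog.analytics.hints (
            hint_id     BIGINT,
            person_id   BIGINT,
            hint_status STRING,
            hint_type   STRING,
            updated_at  TIMESTAMP
        )
        USING iceberg
        PARTITIONED BY (hint_status, hint_type)   -- prune scans on common filters
        TBLPROPERTIES (
            'write.update.mode' = 'merge-on-read',
            'write.delete.mode' = 'merge-on-read',
            'write.merge.mode'  = 'merge-on-read'
        )
    """)

    # High-churn upserts: with merge-on-read, changes land as small delete/insert
    # files reconciled at read time instead of rewriting large data files.
    # 'updates_batch' is assumed to be a temp view of the incoming change set.
    spark.sql("""
        MERGE INTO glue_catalog.analytics.hints AS t
        USING updates_batch AS s
          ON t.hint_id = s.hint_id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)
    ```

    Periodic compaction (for example Iceberg's rewrite_data_files maintenance procedure) is what keeps read amplification under control when this write pattern runs at the change rates described in the post.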
