Tag Archive for: artificial intelligence

The AWS Summit New York 2024 was an exhilarating showcase of cloud innovation, AI advancements, and industry best practices. Hosted at the Jacob K. Javits Convention Center, the action-packed day brought together thousands of professionals, technology enthusiasts, and AWS experts to explore how cutting-edge AWS technologies can be used to revolutionize industries and empower businesses.

At this year’s Summit, over 170 sessions were offered covering a wide range of topics and technical depths, from level 100 (foundational) through level 200 (intermediate) and level 300 (advanced) to level 400 (expert). Within these sessions, AWS experts, builders, customers, and partners shared their insights on topics such as generative AI, analytics, machine learning, industry-specific solutions, and many more. Attendees could customize their agendas ahead of time, choosing from lecture-style presentations and peer-led discussions, and explore the Expo to learn about the numerous advancements in AWS technologies and deepen their understanding of best practices. Dr. Matt Wood, VP for AI Products at AWS, hosted the keynote session, unveiling the latest launches and technical innovations from AWS and demonstrating products and real-world success stories from AWS customers.

Below is a detailed look at some of my key takeaways and the trends that summarize this year’s Summit:

1. Amazon Bedrock

Stemming from the heavy emphasis on generative AI and its capabilities, one of the most exciting announcements from the Summit was the introduction of new capabilities in Amazon Bedrock. Amazon Bedrock is AWS’s relatively new service designed to simplify the creation of AI applications. The service provides access to pre-trained foundation models from leading AI providers and enables businesses to build, deploy, and scale AI-driven solutions without deep expertise or extensive effort. In addition, the many key features of Amazon Bedrock allow users and businesses to build innovative AI solutions effectively and efficiently while ensuring scalability and compliance. The fundamental idea of this service is to revolutionize how companies develop and deploy generative AI applications, making it easier to integrate cutting-edge technology into existing workflows while significantly reducing computational costs.

At this year’s Summit, additional features of Amazon Bedrock were introduced to enhance company knowledge bases with new Amazon Bedrock connectors for Confluence, Salesforce, SharePoint, and web domains. In doing so, companies can empower RAG models with contextual data for more accurate and relevant responses. 

Lastly, Guardrails and the Guardrails API were introduced for Amazon Bedrock to:

  • Bring a consistent level of AI safety across all applications
  • Block undesirable topics in generative AI applications
  • Filter harmful content based on responsible AI policies
  • Redact sensitive information (PII) to protect privacy
  • Block inappropriate content with a custom word filter
  • Detect hallucinations in model responses using contextual grounding checks

Businesses and customers can apply these safeguards to generative AI applications even when the models are hosted outside of AWS infrastructure. Custom Guardrails are estimated to reduce harmful content by up to 85%.
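As a toy illustration of the categories above (denied topics, custom word filters, PII redaction), here is a minimal local sketch. This is NOT the Amazon Bedrock Guardrails API; the topic, word, and pattern lists are invented for demonstration only:

```python
import re

# Hypothetical, simplified stand-in for the kinds of checks listed above.
DENIED_TOPICS = {"investment advice"}  # example denied topic
BLOCKED_WORDS = {"darnword"}           # example custom word filter
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def apply_guardrails(text: str) -> dict:
    """Return a verdict plus a redacted copy of the text."""
    lowered = text.lower()
    if any(topic in lowered for topic in DENIED_TOPICS):
        return {"action": "BLOCKED", "reason": "denied topic", "text": ""}
    if any(word in lowered.split() for word in BLOCKED_WORDS):
        return {"action": "BLOCKED", "reason": "word filter", "text": ""}
    redacted = text
    for label, pattern in PII_PATTERNS.items():
        redacted = pattern.sub(f"[{label}]", redacted)
    return {"action": "ALLOWED", "text": redacted}
```

The real service layers these checks (plus contextual grounding) in front of any model, which is what allows the same safety policy to apply consistently across applications.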

2. Fannie Mae’s Data Science Platform

One of the first sessions I attended was Fannie Mae’s presentation on its data science platform, focused on how Fannie Mae overcame traditional data management challenges through innovative solutions. Data scientists at Fannie Mae were responsible for exploring internal and external datasets, including sensitive data, to develop and train models, create reports and new datasets, deploy models, and share insights. Before adopting AI, Fannie Mae’s data scientists struggled with data access (mostly personally identifiable information), governance, and operationalization. In addition, underwriting analysts spent significant time extracting structured data from unstructured documents: on average, each analyst spent 5 hours per document, across more than 8,000 underwriting documents per year. This inefficient manual document analysis was also resolved through the use of AI.

By leveraging Large Language Models (LLMs) and ontologies, Fannie Mae developed a knowledge extraction system that significantly reduced manual effort. Tools like Amazon Bedrock, Claude 3 Sonnet, Amazon Neptune, LangChain, and Amazon OpenSearch Service played a crucial role in this transformation. The use of AI has generated potential savings of over 32,000 hours annually, along with improvements in the accuracy, compliance, and scalability of underwriting analysis for Fannie Mae.
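The core idea of knowledge extraction (unstructured document in, structured record out) can be sketched with a toy rule-based extractor. This stands in for Fannie Mae's LLM-and-ontology pipeline; the field names and patterns below are hypothetical, invented purely for illustration:

```python
import re

# Hypothetical fields and patterns; a real pipeline would use an LLM guided
# by an ontology rather than hand-written regular expressions.
FIELD_PATTERNS = {
    "loan_amount": re.compile(r"loan amount of \$([\d,]+)", re.IGNORECASE),
    "borrower": re.compile(r"borrower[:\s]+([A-Z][a-z]+ [A-Z][a-z]+)", re.IGNORECASE),
}

def extract_fields(document: str) -> dict:
    """Pull structured key/value pairs out of free-form underwriting text."""
    extracted = {}
    for field, pattern in FIELD_PATTERNS.items():
        match = pattern.search(document)
        if match:
            extracted[field] = match.group(1)
    return extracted
```

An LLM replaces the brittle patterns with language understanding, which is why the approach scales across thousands of heterogeneous documents.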

Such efficiency and savings generated by the use of LLMs and ontologies are simply fascinating. This is a great reflection of how companies across all sectors can utilize the diverse capabilities of AI and customizable machine learning models to generate value.

3. IBM WatsonX & AWS: Scale Gen AI Impact with Trusted Data

Generative AI was a major theme at the Summit, and IBM WatsonX and AWS highlighted their collaborative efforts to expand the impact of this technology. The WatsonX suite offers tools like Watsonx.ai for model development, Watsonx.data for scaling AI workloads, and Watsonx.governance for ensuring responsible AI practices. This partnership brings a shift towards more open, targeted, and cost-effective generative AI solutions, while offering superior price-performance at less than 60% of the traditional costs.

4. Advancing AI and Cloud Solutions

Another key topic at the Summit was Innovating with Generative AI on AWS, which highlighted how businesses can focus on performance, cost-efficiency, and ethical responsibility in AI development. Many strategies were discussed for creating new customer experiences, boosting productivity, and optimizing business processes through generative AI.

Some of the key techniques included Retrieval Augmented Generation (RAG) for grounding model responses in existing information, fine-tuning of AI models, and pre-training to enhance AI capabilities. The session emphasized accessible, high-quality data as the foundation for AI success, enabling businesses to use generative AI to its full potential to drive innovation and create value. By using services designed for innovation and scale, businesses can measure and track value and ROI while optimizing for cost, latency, and accuracy. In addition, businesses can manage risk, maintain trust, and build with compliance and governance in mind.
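The RAG technique mentioned above can be illustrated with a deliberately tiny sketch: retrieve the documents most relevant to a query, then assemble a grounded prompt for a model. Plain word overlap stands in here for the vector search a real knowledge base would perform:

```python
# Toy RAG: lexical-overlap retrieval plus prompt assembly.
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many words they share with the query."""
    query_words = set(query.lower().split())
    return sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Augment the user question with retrieved context."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

A production system would embed documents, search a vector index, and send the assembled prompt to a foundation model; the retrieve-then-augment shape is the same.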

5. Boosting Employee Productivity with AI Agents

Another highlight was the exploration of AI agents powered by Amazon Q. With Amazon Q, businesses can design AI agents that integrate seamlessly with tools like Slack, Microsoft Teams, and other AWS-supported data sources to enhance employee productivity. These AI agents can improve efficiency across teams and organizations by streamlining data interactions and automating repetitive tasks. A demo connecting a Slack instance to Amazon Q and deploying it into the Slack workspace showed the simplicity of the whole process and how quickly Amazon Q can generate value for an organization.

6. Building a Strong Data Foundation for Generative AI

A central theme at the Summit was the importance of a solid data foundation for successful generative AI initiatives. AWS demonstrated how businesses can harness structured and unstructured data through various tools and services. Key components of this foundation include:

  • Data Storage: Managing structured and unstructured data using SQL, NoSQL, and graph databases
  • Data Analytics: Utilizing data lakes for search, streaming, and interactive analytics
  • Vector Embeddings: Tokenizing and storing data for semantic similarity searches
  • Data Integration: Combining data from different sources using tools like AWS Glue and Amazon DataZone
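The vector-embedding bullet above can be made concrete with a minimal sketch. Real embeddings come from an embedding model and live in a vector store; the three-dimensional vectors below are invented for illustration:

```python
import math

# Toy semantic-similarity search over stored embedding vectors.
def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms

def nearest(query_vec: list[float], index: list[dict]) -> dict:
    """Return the stored item whose embedding is most similar to the query."""
    return max(index, key=lambda item: cosine(query_vec, item["embedding"]))
```

Semantic search works because texts with similar meaning land near each other in embedding space, so cosine similarity ranks relevance without any keyword match.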

7. Governance and Compliance in the Cloud

Governance and compliance were also significant topics, with AWS highlighting how organizations can manage data securely and efficiently. Enterprise customers look for democratized data tools with built-in governance to discover, understand, and access data across organizations, with the ability for multiple personas to collaborate on the same data problems. In addition, easy-to-use and easy-to-access analytics and BI tools are crucial for value creation. The Summit showcased services like AWS IAM, Amazon Cognito, AWS Lake Formation, and Amazon S3 for data management, access control, and auditing. These tools help ensure that cloud operations are compliant with regulations and best practices.

8. The Future of Generative AI

Lastly, the Summit concluded with a discussion on the future of generative AI, covering the evolution of AI agents such as Ninjatech.AI, multimodal models, and new regulations. The session also explored the balance between value and feasibility in AI projects: it is crucial to identify the value generated from productivity, experience, and revenue, while also ensuring that innovation is both effective and sustainable.

The AWS Summit New York 2024 highlighted the latest advancements in cloud technology and AI. One of the major releases, the new Amazon Bedrock capabilities, allows businesses to build, deploy, and scale AI-driven solutions without extensive expertise and effort. This frees businesses to focus more on performance, cost, and ethical responsibilities with generative AI.

The Summit offered valuable insights and tools for businesses looking to leverage cloud computing for innovation and efficiency. Many case studies showcased the adoption of generative AI across all sectors and the places where it can create value throughout a business. The sense of urgency to adopt generative AI has doubled since last year, and the emphasis on building a solid data foundation for successful generative AI initiatives has never been greater. The many new innovations simplify the process for businesses to leverage data to create and differentiate generative AI applications and to create new value for customers and the business. The phrase “Your data is the differentiator” is worth remembering as businesses navigate the AI journey.

Overall, the AWS Summit provided a comprehensive look at how AWS is shaping the future of technology. With a strong emphasis on AI and machine learning advancements, security enhancements, and sustainability efforts, the future has never looked so bright for businesses, developers, and consumers. 

Be sure to check out our Take30 webinar this Thursday around the approach and features of Ironside’s AscentAI.

Artificial Intelligence, at its core, is a wide-ranging tool that enables us to think differently about how to integrate information, analyze data, and use the resulting insights to improve decision making.

With the current shift to digitization (which has been accelerated by the pandemic), customer behavior has changed significantly, along with expectations around the accuracy of AI-based predictions. We are all used to “What to Watch Next” recommendations from streaming channels like Netflix or “Suggested Products” to buy from Amazon, but now, with most businesses offering an expected “Wait Time” before your haircut or a pickup time for the food you ordered online, it is critical to manage the queue to ensure timely service that begets customer satisfaction & retention.

Many organizations want to leverage AI but are unable to, mainly for the following reasons:

  • High cost and time to market
  • Complexity and lack of expertise
  • Uncertainty about the success of AI outcomes

Because one of our goals is to help our customers benefit from the AI revolution and derive real business value, all in a timely fashion, we at Ironside have introduced a product-style approach to Data Science called AscentAI.

As with any project, the first steps are to identify and prioritize the business use case, define the objective, and clearly specify the goals. Next, we offer a rapid viability assessment (RVA), completed in under three weeks, which determines whether a model provides sufficient signal to justify productionizing it.

The activities during RVA involve:

  • Data collection & preparation
    • The quality and quantity of the data dictate the accuracy of the model
    • Split the data into two distinct datasets for training and evaluation
  • Feature engineering
    • Identify & define features for the models
  • Model training & evaluation
    • Choose different models and identify the best one for the defined requirements
  • Make & validate predictions
    • Measure prediction accuracy against real data sets
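The RVA activities above can be sketched end to end: split the data, “train” a model, and score it on the held-out set. The mean-threshold model below is a deliberately simple stand-in for the candidate models an RVA would actually compare:

```python
import random

def train_test_split(rows, test_ratio=0.25, seed=42):
    """Shuffle reproducibly and split into training and evaluation sets."""
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_ratio))
    return rows[:cut], rows[cut:]

def fit_threshold(train):
    """'Train': predict positive whenever the feature exceeds the mean."""
    return sum(x for x, _ in train) / len(train)

def accuracy(test, threshold):
    """Measure prediction accuracy against the held-out rows."""
    return sum((x > threshold) == label for x, label in test) / len(test)
```

The held-out accuracy is exactly the "signal vs. noise" evidence the RVA gate decision rests on: strong accuracy justifies productionizing, weak accuracy means moving to the next use case.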

The RVA is a decision phase: if the AI provides more noise than signal, there is no value in developing production-ready machine learning models. This gate-based approach gives the customer clear visibility into expected outcomes and supports an informed decision to either pursue the current use case or move on to the next one.

If the RVA provides meaningful insights, the next step is to productionize the best model, integrating it into existing business processes so the business can start consuming the predictions.

The final step adds a performance monitoring dashboard to track model performance, identify the need for tuning, and optimize the model for the skew naturally expected over time. Finally, we strongly recommend retraining the model at a predefined frequency to ensure the AI consistently delivers on the expected ROI.
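A minimal sketch of that monitoring-and-retraining idea, assuming a rolling window of live prediction outcomes (the window size and accuracy threshold below are illustrative choices, not prescribed values):

```python
from collections import deque

class DriftMonitor:
    """Flag when rolling live accuracy falls below a retraining threshold."""

    def __init__(self, window: int = 100, min_accuracy: float = 0.8):
        self.outcomes = deque(maxlen=window)  # True where prediction was right
        self.min_accuracy = min_accuracy

    def record(self, predicted, actual) -> None:
        self.outcomes.append(predicted == actual)

    def needs_retraining(self) -> bool:
        if not self.outcomes:
            return False
        return sum(self.outcomes) / len(self.outcomes) < self.min_accuracy
```

A dashboard would chart the rolling accuracy over time; the retraining flag is what turns that chart into the predefined-frequency retraining loop described above.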

Below is a snapshot of a real implementation of AscentAI for a customer with 1,800+ stores, predicting accurate “wait times” in real time in a very high-volume setting on the AWS cloud platform.
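To give a feel for the shape of that problem, here is a deliberately naive wait-time baseline: queue position times a rolling average of recent service times. The production model is far more sophisticated; everything below is illustrative only:

```python
from collections import deque

class WaitTimeEstimator:
    """Naive baseline: wait = place in line x average recent service time."""

    def __init__(self, window: int = 50):
        self.recent = deque(maxlen=window)  # recent service durations (minutes)

    def record_service(self, minutes: float) -> None:
        self.recent.append(minutes)

    def estimate(self, position_in_queue: int) -> float:
        """Estimated wait in minutes for the given place in line."""
        if not self.recent:
            return 0.0
        return position_in_queue * (sum(self.recent) / len(self.recent))
```

A real model would add features such as staffing, time of day, and store-level patterns, which is precisely where machine learning beats a rolling average.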

Ironside’s Take30 with a Data Scientist series was typically targeted at business leaders, with topics focused on strategy, including use-case development advice, de-risking AI with Data Science-as-a-Service, and ways to overcome common barriers to AI adoption. We also covered technical concepts like model evaluation and feature store development. On top of that, we took several deep dives into technology partners including IBM Watson AutoAI, AWS SageMaker Studio, Snowflake, and DataRobot. Finally, we had a couple of industry spotlights where we explored common use cases in Higher Education and Insurance.

Several attendees have shared that these sessions bridge the gap between the technical world of Machine Learning and that of their business, which in turn has helped them bridge that gap within their own organizations. For technicians, it has helped them understand how to talk to the business, draw out use cases, and help the business adopt solutions. For business leaders, it has helped them know what to ask of the data science team, or what to look for in building one.

Overcoming the Most Common Barriers to AI Adoption (2/25/21)

Because so many organizations are in the early stages of AI Adoption, this is likely the most important topic to CIOs and business leaders in the Data Science series. This session discusses the challenges with people, infrastructure, and data that every organization faces and offers sound advice on how to overcome them.

Is Data Science-as-a-Service Right for your Organization? (5/19/20)

AscendAI, Ironside’s Data Science-as-a-Service, provides many benefits to organizations that are in the early or mid-stages of AI Adoption. Learn more about Ironside’s offering and how it could reduce your time to ROI to as little as 12 weeks.

How Snowflake Breaks the Chains Holding Your Data Science Team Back (9/10/20)

We hosted a number of technology-related sessions with partners such as Snowflake. This session dove a bit deeper than Data Science Best Practices: Feature Stores. Other technology-related sessions covered Watson Studio and AWS SageMaker, along with a data enrichment session with Precisely titled More Data, More Insight: The Value of Data Enrichment for Analytics.

Data Science work requires infrastructure that is scalable, cost-effective, and with easy access to multiple data sources. Snowflake provides this and much more to a data science tech stack. It also integrates easily with other machine learning platforms like DataRobot, AWS, and Azure. Snowflake is particularly valuable for data sharing with external data sources.

Leveraging Data for Predicting Outcomes in Higher Ed (6/30/20)


We hosted an industry-related session sharing how Higher Education is leveraging machine learning in very creative ways; this ended up being one of our top attended sessions in the Take30 series. In this webinar, we reviewed some of the ways that higher ed is using machine learning, such as enrollment management, space planning, and student retention. We also discussed some of the use cases that are helping universities cope with the challenges and nuances of COVID-19. In addition, we hosted another industry-specific session on Insurance.

______

As we continue our Take30 with a Data Scientist series, we’ll continue to partner with experts in Machine Learning technology to offer demos and successful solutions as well as strategic sessions for business leaders. We also hope to spotlight some of our clients this year and the exciting AI driven applications we are developing for them in Retail, Insurance, Higher Ed, and Manufacturing. Coming up on May 20th, we will be hosting an industry focus for Banking. 

We’d love to have 1-on-1 conversations to discuss any challenges you may be facing with AI adoption. Please feel free to sign up for a spot with Pam Askar, our Director of Data Science.

So you’re thinking that 2021 is the year to infuse Artificial Intelligence/Machine Learning (AI/ML) into your business. You’ve read about the difference it’s making in other organizations. You want to beat — or keep pace with — your competitors. But where should you begin? 

Should you license AI/ML software? How do you find the right business problem to solve? And if you’re like most organizations, your data is imperfect. Should you focus there first? 

Ironside can help. We’re a data and analytics consulting firm with a track record of helping companies get started with AI.

Let’s start with four things we think every organization should consider on their AI journey. They’re not the only four things you need to know, but we know your time is valuable, so let’s start here:

  1. Develop an AI Use Case Catalog – One of the first steps is to develop a list of possible business problems, opportunities, or challenges that AI might improve. We assist our customers in building that list by talking with executives and functional area leaders to understand the organization’s strategic goals and how they are measured, then considering challenges and pain points and what missing information would improve decision making. The catalog of use cases should be enhanced with information from a thorough data analysis. Is there the data to support the use case? Is there enough of it? What’s the data quality? Can you forecast improvement of an important metric? What’s the return on investment? 
  2. Involve a variety of stakeholders – Executive sponsorship in some form is critical in funding and executing an AI project. But building a culture of AI across an organization starts by involving as many stakeholders as possible across functional areas. Even if the use cases surfaced by some stakeholders are not immediately pursued, people want to be included and to have a voice. Broader involvement will avoid roadblocks and seed a culture of AI. Organizational success will grow over time. 
  3. Start small – Rank the use case catalog and find one or two to test. Identify the relevant business sponsor and data and prepare a limited set of data, or features. Build a simple machine learning algorithm to see if there are results that show that AI could improve a desired outcome. If not, move on to the next use case in the catalog (see, that’s why we need a catalog). If the early results are promising, move on to building a more advanced machine learning model.  
  4. Limit your investment – For as low a cost as possible, get a model deployed and start using it in the business to begin to get the benefit. You’ll inevitably iterate on that model, but expediting that process and limiting the investment — and the risk — is the goal. 

Now here’s where we answer the questions about hiring a data scientist or buying software. Sometimes the answer is yes, but for many organizations the answer is “no.” They’re just not sophisticated enough yet. And big, costly failures could sour your organization on pursuing AI and set you back years.

Ascend AI 

So what should you do? One option for getting started is Ascend AI, a data-science-as-a-service solution that Ironside developed. Ascend AI takes the risk out of diving into AI on your own. It is underpinned by a custom-configured, scripted, cloud-based architecture as well as our highly skilled data scientists.

We bring the data scientists and engineers and the technology. You provide the data and the business problems. 

We start with your leading use cases, or help you develop them in a use case catalog. Then we perform rapid viability assessments on the selected use cases, and if the signs are good, we build out full machine learning algorithms. Finally, we can deploy, host, and manage the algorithms. At any point, depending on customer preference and maturity, we hand the IP back to our customers and help them develop AI competency in house. We’re not a black box.

Of course, there’s more to getting started with AI than these four points, and data science as a service might not always be the answer. The thing to remember is that AI should be consumed in bite-sized chunks and is attainable for even the most technologically immature organizations.

LEXINGTON, MA, May 10, 2019 – Ironside, an enterprise data and analytics firm, was featured in a Wall Street Journal article about AI consultants that enable their clients to be self-sufficient with AI and not have to rely on their consulting counterparts to manage the model.


LEXINGTON, MA, May 3, 2019 – Ironside, an enterprise data and analytics firm, was recognized in the Wall Street Journal this morning for AI work being done at one of our clients, Coverys, a Boston-based provider of medical professional liability insurance.


2019 is the year that data science, machine learning, and artificial intelligence for business will become ubiquitous. Most organizations, large and small, across all industries have recognized the benefits and competitive advantage that these capabilities bring to bear. If you have not already begun the journey, chances are this will be the year you begin to develop this competency. Whether you’re about to take your first step, you’re a team of one looking to scale, or even a more mature organization that is always seeking self-improvement, consider the following traits to maximize your chances of success with data science.


At least weekly, I am granted the opportunity to meet and work alongside experienced professionals who serve in a corporate business intelligence (BI) leadership function. When they describe their role upon introduction, there is a common thread to the scope of influence and control which usually intersects one or more of these domains: Read more