Hugging Face

Gretel Releases World’s Largest Open Source Text-to-SQL Dataset to Accelerate AI Model Training

Retrieved on: 
Thursday, April 4, 2024

SAN FRANCISCO, April 04, 2024 (GLOBE NEWSWIRE) -- Gretel, the leader in synthetic data, today released the world’s largest open source Text-to-SQL dataset to unlock new possibilities for AI in the enterprise.

Key Points: 
  • "Access to quality training data is one of the biggest obstacles to building with generative AI.
  • Everything Gretel does is designed to address this issue head-on, and contributing to the open-source community is no exception," said Alex Watson, co-founder & Chief Product Officer at Gretel.
  • We’re excited for developers to take our dataset for a spin, and build upon it.”
    The largest AI companies in the world are struggling with access to high-quality training data.
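
For developers who want to take the dataset for a spin, the sketch below shows one way to load it with the Hugging Face datasets library; the repository id is an assumption based on Gretel’s Hugging Face organization and should be verified there.

    # A minimal sketch, assuming the dataset is hosted on the Hugging Face Hub
    # under an id along the lines of "gretelai/synthetic_text_to_sql".
    from datasets import load_dataset

    ds = load_dataset("gretelai/synthetic_text_to_sql")  # assumed repository id

    # Inspect what the release contains before building on it.
    print(ds)                           # splits and row counts
    print(ds["train"].column_names)     # column names of the training split
    print(ds["train"][0])               # one example record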

Cloudflare Powers One-Click-Simple Global Deployment for AI Applications with Hugging Face

Retrieved on: 
Tuesday, April 2, 2024

Cloudflare, Inc. (NYSE: NET), the leading connectivity cloud company, today announced that developers can now deploy AI applications on Cloudflare’s global network in one simple click directly from Hugging Face, the leading open and collaborative platform for AI builders.

Key Points: 
  • “The recent generative AI boom has companies across industries investing massive amounts of time and money into AI.
  • Workers AI is also expanding to support fine-tuned model weights, enabling organizations to build and deploy more specialized, domain-specific applications.
  • “We are excited to work with Cloudflare to make AI more accessible to developers,” said Julien Chaumond, co-founder and chief technology officer, Hugging Face.
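
The one-click deployment itself happens in the Hugging Face and Cloudflare dashboards, but once a model is running on Workers AI it can be called over Cloudflare’s REST API, as in the minimal sketch below; the account id, API token, and model slug are placeholders, and the catalog model shown is an assumed example.

    # A minimal sketch of invoking a model hosted on Cloudflare Workers AI.
    # CF_ACCOUNT_ID and CF_API_TOKEN are placeholder environment variables,
    # and the model slug is an assumed example from the Workers AI catalog.
    import os
    import requests

    ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]
    API_TOKEN = os.environ["CF_API_TOKEN"]
    MODEL = "@cf/meta/llama-2-7b-chat-int8"  # substitute the model you deployed

    url = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/run/{MODEL}"
    resp = requests.post(
        url,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"messages": [{"role": "user", "content": "Summarize what Workers AI does."}]},
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json())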

Domo Introduces Groundbreaking AI Features to Expand Domo.AI Framework and Bring the Transformative Power of AI to All

Retrieved on: 
Thursday, March 28, 2024

Today at Domopalooza: the AI + Data Conference, Domo (Nasdaq: DOMO) announced a new wave of features for Domo.AI, the company’s framework of flexible artificial intelligence (AI) services, powered by its award-winning platform.

Key Points: 
  • New features announced today help ensure each organization’s unique AI needs are fulfilled, through broader access to AI models offered across the industry and by bringing AI technology to life simply and securely within Domo.
  • “By creating that type of open framework for AI + Data, we are giving users a pathway to immediate success, without going to the extraordinary lengths they may otherwise have to.
  • AI Chat will also enable Domo users to build better, faster dashboards and generate insights in a conversational way for smarter, faster decision making.

DataStax and Microsoft Collaborate to Make it Easier to Build Enterprise Generative AI and RAG Applications with Legacy Data

Retrieved on: 
Tuesday, March 26, 2024

DataStax, the generative AI data company, today announced a milestone in its journey to simplify enterprise retrieval-augmented generation (RAG) for developers by integrating with Microsoft Semantic Kernel.

Key Points: 
  • This integration enables developers to more easily build RAG applications and vectorize data with Astra DB and Microsoft's ecosystem of AI products and copilots using Semantic Kernel’s open source SDK for AI applications and agents.
  • Developers are seeking solutions to streamline the development of more powerful RAG applications and AI agents.
  • Key features of Semantic Kernel include semantic functions, chaining capabilities, planners, and connectors for various enterprise applications and data sources.
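
To make the “vectorize data with Astra DB” step concrete, here is a minimal sketch of the vector-store half of such a RAG pipeline using DataStax’s astrapy client rather than the Semantic Kernel SDK itself; the endpoint, token, collection name, and toy three-dimensional embeddings are assumptions, and astrapy’s method names have shifted across versions.

    # A minimal sketch: store embedded chunks in Astra DB and run a vector search.
    # Credentials are placeholder environment variables; a real pipeline would
    # produce the $vector values with an embedding model.
    import os
    from astrapy.db import AstraDB

    db = AstraDB(
        token=os.environ["ASTRA_DB_APPLICATION_TOKEN"],
        api_endpoint=os.environ["ASTRA_DB_API_ENDPOINT"],
    )
    collection = db.create_collection("rag_chunks", dimension=3)

    collection.insert_one({"_id": "doc-1", "text": "Quarterly revenue grew 12%.", "$vector": [0.12, 0.80, 0.10]})
    collection.insert_one({"_id": "doc-2", "text": "Support tickets fell in Q4.", "$vector": [0.70, 0.10, 0.20]})

    # Retrieve the chunk closest to a (toy) query embedding.
    print(collection.vector_find([0.11, 0.78, 0.12], limit=1))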

ServiceNow Furthers Generative AI Leadership With New Capabilities in the Washington, D.C. Platform Release

Retrieved on: 
Wednesday, March 20, 2024

ServiceNow (NYSE: NOW), the leading digital workflow company making the world work better for everyone, today furthered its generative AI (GenAI) leadership with new capabilities in its Washington, D.C. platform release.

Key Points: 
  • “ServiceNow leads the industry with secure, responsible generative AI solutions, all on a single platform for end-to-end business transformation.
  • The new ServiceNow Impact AI Accelerators allow platform owners to adopt ServiceNow generative AI experiences quickly and easily, map investments to business objectives, and track the value they’ve gained from generative AI for faster ROI.
  • “Through ServiceNow Impact and the generative AI Accelerator, we can now get a preview of generative AI in action, tailored to our business.

ServiceNow Advances Enterprise-Grade Generative AI Through Expanded Partnership With NVIDIA

Retrieved on: 
Monday, March 18, 2024

NVIDIA GTC — ServiceNow (NYSE: NOW), the leading digital workflow company making the world work better for everyone, today announced an expansion of its partnership with NVIDIA to advance the use of enterprise-grade generative AI (GenAI).

Key Points: 
  • ServiceNow is using NVIDIA NIM inference microservices to serve its Now LLMs, the domain-specific LLMs that power capabilities within Now Assist, ServiceNow’s generative AI experience.
  • “ServiceNow and NVIDIA are building a future where businesses can break through every barrier,” said ServiceNow Chairman and CEO Bill McDermott.
  • “Together, NVIDIA and ServiceNow are helping enterprises everywhere embrace generative AI within the platforms they use to serve customers, manage employees, enhance their operations, and transform their industries.”

NVIDIA and ServiceNow announced their initial partnership to develop powerful enterprise-grade generative AI capabilities in May 2023.
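
For context on what serving LLMs with NIM looks like in practice, NIM containers expose an OpenAI-compatible HTTP API, so a client can simply be pointed at the microservice as in the sketch below; the base URL, port, and model id are assumptions for a locally hosted NIM instance, since ServiceNow’s Now LLMs are not publicly exposed.

    # A minimal sketch: querying an LLM served by an NVIDIA NIM microservice
    # through its OpenAI-compatible API. Base URL, port, and model id are
    # placeholders for a locally hosted instance.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

    response = client.chat.completions.create(
        model="example-now-llm",  # placeholder model id reported by the NIM instance
        messages=[{"role": "user", "content": "Draft a summary of this incident ticket."}],
        max_tokens=200,
    )
    print(response.choices[0].message.content)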

Cerebras and G42 Break Ground on Condor Galaxy 3, an 8 exaFLOPs AI Supercomputer

Retrieved on: 
Wednesday, March 13, 2024

Cerebras Systems, the pioneer in accelerating generative AI, and G42, the Abu Dhabi-based leading technology holding group, today announced the build of Condor Galaxy 3 (CG-3), the third cluster of their constellation of AI supercomputers, the Condor Galaxy.

Key Points: 
  • The Cerebras and G42 strategic partnership already delivered 8 exaFLOPs of AI supercomputing performance via Condor Galaxy 1 and Condor Galaxy 2, each amongst the largest AI supercomputers in the world.
  • Located in Dallas, Texas, Condor Galaxy 3 brings the current total of the Condor Galaxy network to 16 exaFLOPs.
  • “By doubling the capacity to 16 exaFLOPs, we look forward to seeing the next wave of innovation Condor Galaxy supercomputers can enable.”

At the heart of Condor Galaxy 3 are 64 Cerebras CS-3 Systems.
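
The figures in the announcement are internally consistent, as the short arithmetic check below shows: 64 CS-3 systems delivering 8 exaFLOPs works out to roughly 125 petaFLOPs of AI compute per system, and adding CG-3 to the 8 exaFLOPs already delivered by CG-1 and CG-2 brings the network total to 16 exaFLOPs.

    # Back-of-the-envelope check of the announced figures.
    cg3_exaflops = 8
    cs3_systems = 64
    print(cg3_exaflops * 1000 / cs3_systems)   # 125.0 petaFLOPs per CS-3 system

    existing_exaflops = 8                      # Condor Galaxy 1 + Condor Galaxy 2
    print(existing_exaflops + cg3_exaflops)    # 16 exaFLOPs across the network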

Databricks Launches DBRX, A New Standard for Efficient Open Source Models

Retrieved on: 
Wednesday, March 27, 2024

SAN FRANCISCO, March 27, 2024 /PRNewswire/ -- Databricks, the Data and AI company, today announced the launch of DBRX, a general purpose large language model (LLM) that outperforms all established open source models on standard benchmarks. DBRX democratizes the training and tuning of custom, high-performing LLMs for every enterprise so they no longer need to rely on a small handful of closed models. Available today, DBRX enables organizations around the world to cost-effectively build, train, and serve their own custom LLMs.

Key Points: 
  • "We're excited about DBRX for three key reasons: first, it beats open source models on state-of-the-art industry benchmarks.
  • Second, it beats GPT-3.5 on most benchmarks, which should accelerate the trend we're seeing across our customer base as organizations replace proprietary models with open source models.
  • DBRX sets a new standard for open source models, enabling customizable and transparent generative AI for all enterprises.
  • A recent survey from Andreessen Horowitz found that nearly 60 percent of AI leaders are interested in increasing open source usage or switching when fine-tuned open source models roughly match performance of closed source models.
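
For teams that want to evaluate DBRX directly, a minimal sketch of loading the instruct variant from the Hugging Face Hub with transformers follows; the repository id reflects Databricks’ published release, access may require accepting the model license, trust_remote_code may or may not be needed depending on the transformers version, and the model is far too large to run outside a multi-GPU server.

    # A minimal sketch: generate with DBRX Instruct via transformers.
    # Illustrative only; DBRX is a large mixture-of-experts model and
    # requires substantial GPU memory.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "databricks/dbrx-instruct"
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,
        device_map="auto",
        trust_remote_code=True,
    )

    messages = [{"role": "user", "content": "What workloads is DBRX designed for?"}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))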

Ngram Releases Groundbreaking AI Dataset to Revolutionize Medical Information Access for Healthcare Professionals

Retrieved on: 
Friday, March 22, 2024

SAN FRANCISCO, March 22, 2024 /PRNewswire-PRWeb/ -- Ngram, a San Francisco-based startup specializing in generative AI solutions for the life sciences industry, today announced the release of its innovative dataset, medchat-qa, on Hugging Face. This dataset comprises a blend of real-world and synthetic questions healthcare providers (HCPs) frequently ask pharmaceutical companies about their drugs. It covers critical topics such as dosage, adverse reactions, drug interactions, new indications, and off-label uses. The release of medchat-qa marks a significant step in Ngram's mission to simplify the process for Medical Affairs teams to swiftly access literature and deliver quick, precise responses to HCPs.

Key Points: 
  • Traditionally, doctors, nurses, pharmacists, and other HCPs seeking information about a drug must reach out to the pharmaceutical company's medical information department.
  • "HCPs require immediate access to accurate and current information to make informed treatment decisions," said Anish Muppalaneni, CEO and co-founder of Ngram.
  • "The medchat-qa dataset is a vital step toward fulfilling this objective and transforming how Medical Affairs teams deliver and access information for healthcare professionals."
  • The open-source release of the medchat-qa dataset on Hugging Face represents substantial progress in medical information access, highlighting Ngram's dedication to enhancing the healthcare experience for professionals and patients alike.
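
Since medchat-qa is published openly on Hugging Face, Medical Affairs and ML teams can pull it with the datasets library as sketched below; the organization prefix in the repository id is a guess and should be verified on Ngram’s Hugging Face page.

    # A minimal sketch: load the medchat-qa dataset from the Hugging Face Hub.
    from datasets import load_dataset

    REPO_ID = "ngram/medchat-qa"  # hypothetical organization prefix; verify on the Hub
    ds = load_dataset(REPO_ID)

    print(ds)                        # splits and sizes
    print(ds["train"].column_names)  # question/answer field names
    print(ds["train"][0])            # one HCP-style question record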