When Salesforce is life!


Understanding the Salesforce AI Coding Assistant: CodeGen LLM

Gilad David Maayan is a technology writer who has worked with over 150 technology companies including SAP, Imperva, Samsung NEXT, NetApp and Check Point, producing technical and thought leadership content that elucidates technical solutions for developers and IT leadership. Today he heads Agile SEO, the leading marketing agency in the technology industry.


What Is CodeGen LLM? 

CodeGen LLM, or code generation language model, is an open source AI system created by Salesforce, which assists with coding tasks. It leverages a neural network architecture to parse and generate code across various programming languages. CodeGen automates repetitive tasks and improves coding efficiency, allowing developers to focus more on creative problem-solving tasks. Enabled by a large corpus of code data, CodeGen can predict and suggest code snippets, enhancing the software development process.

The primary role of CodeGen is to reduce time spent on mundane code-writing steps, thus accelerating the development cycle. Its integration into development environments lets developers move fluidly between human-written and machine-generated code, delivering productivity gains.

CodeGen Versions 

Since its initial release, CodeGen has gone through several iterations, each enhancing its capabilities and performance.

  • CodeGen 1.0: Launched in early 2022, this was the first major version of Salesforce’s open-source LLM for code generation. It featured up to 16 billion parameters, making it one of the largest open-source models at the time. CodeGen 1.0 established a foundation for generating and understanding code across various programming languages.
  • CodeGen 2.0: Released in early 2023, this version introduced improvements in the quality of code generation. It became a practical tool for developers, reportedly saving them around 90 minutes per day by automating routine coding tasks. With this release, CodeGen also began to be used internally at Salesforce for AI-powered development workflows.
  • CodeGen 2.5: Released in July 2023, CodeGen 2.5 was optimized for production environments, offering lower latency and better overall performance. It was trained on a massive dataset, StarCoderData, containing 783GB of code from 86 programming languages. With over 600,000 monthly downloads, CodeGen 2.5 has become widely adopted.

CodeGen Architecture and Components 

CodeGen is built on a transformer-based architecture, which uses self-attention mechanisms to handle both programming and natural language tasks. At its core it is an autoregressive transformer optimized for code generation. Later versions rely on a prefix-based design, known as a Prefix-LM, to unify the strengths of both bi-directional and uni-directional attention mechanisms. This design allows CodeGen to handle both code synthesis and understanding tasks by enabling bi-directional attention over the context (the prefix) and uni-directional attention for auto-regressive code generation.
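The prefix-based attention pattern described above can be visualized as a simple mask matrix: positions inside the prefix attend bi-directionally over the whole prefix, while later positions attend only causally. The sketch below illustrates the general Prefix-LM idea, not Salesforce's actual implementation:

```python
def prefix_lm_mask(seq_len, prefix_len):
    """Boolean attention mask for a prefix language model.

    mask[i][j] is True when position i may attend to position j:
    bi-directional within the prefix, causal (left-to-right) after it.
    """
    mask = [[False] * seq_len for _ in range(seq_len)]
    for i in range(seq_len):
        for j in range(seq_len):
            if i < prefix_len:
                # Prefix tokens see the whole prefix (bi-directional).
                mask[i][j] = j < prefix_len
            else:
                # Later tokens see everything up to and including themselves.
                mask[i][j] = j <= i
    return mask

# 5 positions, the first 2 of which form the fully visible prefix.
mask = prefix_lm_mask(seq_len=5, prefix_len=2)
```

In an actual transformer this mask would be applied to the attention scores before the softmax, so disallowed positions contribute nothing.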

The model is trained using a mix of causal language modeling and span corruption, ensuring information transfer across various tasks. Span corruption allows the model to recover missing sections of code, making it useful for code completion tasks. CodeGen also incorporates infill sampling, enabling the model to fill in missing code between two known sections, improving its flexibility in generating structured and coherent code.
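Span corruption can be illustrated with a toy example: a span of tokens is cut out and replaced by a sentinel, and the model is trained to reproduce the missing span. The sentinel name below is generic; real tokenizations and sentinel formats differ between model versions:

```python
def corrupt_span(tokens, start, length, sentinel="<mask_0>"):
    """Replace tokens[start:start+length] with a sentinel token.

    Returns (corrupted_input, target): the model learns to emit the
    sentinel followed by the original span, i.e. to recover the hole.
    """
    span = tokens[start:start + length]
    corrupted = tokens[:start] + [sentinel] + tokens[start + length:]
    target = [sentinel] + span
    return corrupted, target

code = ["def", "add", "(", "a", ",", "b", ")", ":", "return", "a", "+", "b"]
corrupted, target = corrupt_span(code, start=8, length=4)
# The function body is hidden; the training target is the hidden span.
```

Infill sampling at inference time works the same way in reverse: the prompt supplies the code before and after the hole, and the model generates what belongs at the sentinel.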

Additionally, the training data for CodeGen includes a mixture of programming languages and natural language, which enhances its versatility. The mixture of these datasets helps CodeGen excel in multi-modal environments, supporting diverse programming needs while maintaining strong performance in natural language processing.

CodeGen Use Cases 

CodeGen LLM serves a variety of practical purposes within software development, enabling automation and enhancing productivity for developers. One key use case is code completion. CodeGen is trained to predict the next sequences of code based on existing patterns, making it invaluable for completing partially written code. This functionality reduces the time developers spend on tasks like closing brackets, writing function endings, or repeating known structures.

Another prominent use case is code synthesis. CodeGen can generate new code snippets based on high-level descriptions or function names. This capability aids in rapidly creating boilerplate code, such as class definitions, import statements, or repetitive logic.

In addition to these capabilities, code refactoring is another area where CodeGen excels. By analyzing and understanding existing code, it can suggest optimizations, enforce coding standards, and identify areas that can be improved. This reduces the likelihood of errors and improves the quality of the codebase over time.

Finally, CodeGen supports multilingual coding environments, allowing it to switch between different programming languages as needed. This versatility makes it suitable for projects that involve multiple languages, enhancing collaboration across teams and minimizing the friction of switching between syntax rules.

Notable CodeGen Alternatives 

CodeGen LLM is a newcomer to the AI coding assistant arena, and there are several established alternatives. Here are a few tools you might consider as alternatives to the Salesforce offering.

Tabnine

Tabnine is an AI-powered code assistant that automates repetitive tasks and improves code generation efficiency.

Key features of Tabnine include:

  • Autogenerated code: Generates high-quality code and converts plain text into code, reducing the time spent on repetitive tasks.
  • AI chat for development: Provides AI-driven assistance throughout the software development lifecycle, from code creation and testing to documentation and bug fixing.
  • Context-aware suggestions: Offers personalized code suggestions based on the developer’s code patterns and usage history.
  • Wide language and IDE support: Compatible with popular programming languages, libraries, and integrated development environments (IDEs).
  • Customizable AI models: Allows developers to create models specifically trained on their own codebase for more tailored assistance.

GitHub Copilot

GitHub Copilot is an AI-powered coding assistant that enhances developer workflows by providing real-time code suggestions and improving code quality.

Key features of GitHub Copilot include:

  • AI-based code suggestions: Offers real-time code completions and suggestions as developers type, based on the context of the project and style conventions.
  • Natural language to code: Translates natural language prompts into functional code, allowing developers to build features and fix bugs more efficiently.
  • Improved code quality: Enhances code quality with built-in vulnerability prevention, blocking insecure coding patterns and ensuring safer code.
  • Collaboration-enhancing: Acts as a virtual team member, answering questions about the codebase, explaining complex code snippets, and offering suggestions for improving legacy code.
  • Personalized documentation: Provides tailored documentation with inline citations.

Amazon Q Developer

Amazon Q Developer is a generative AI-powered assistant built to streamline software development tasks and optimize AWS resource management.

Key features of Amazon Q Developer include:

  • Real-time code suggestions: Provides instant code completions, from simple snippets to full functions, based on your comments and existing code. It also supports command-line interface (CLI) completions and natural language translations to bash.
  • Autonomous agents for software development: Automates multi-step tasks like feature implementation, code documentation, and project bootstrapping, all initiated from a single prompt.
  • Legacy code modernization: Facilitates quick upgrades for legacy Java applications, with transformations from Java 8 to Java 17, and upcoming support for cross-platform .NET transformations.
  • Custom code recommendations: Integrates securely with private repositories to generate highly relevant code suggestions and help developers understand internal codebases more effectively.
  • Infrastructure management via chat: Assists with AWS resource management, from diagnosing errors and fixing network issues to recommending optimal instances for various tasks, all through simple natural language prompts.

Replit AI

Replit AI is an AI-powered coding assistant designed to collaborate with developers in building software efficiently.

Key features of Replit AI include:

  • Context-aware assistance: Provides personalized suggestions based on the entire codebase, offering help with debugging, generating test cases, writing documentation, and setting up API integrations.
  • Collaborative AI chat: Enables teamwork by allowing developers to collaborate in real-time using AI chat to solve coding challenges and implement features together.
  • Code understanding: Helps developers navigate unfamiliar codebases, frameworks, APIs, and languages by providing explanations and clarifying complex sections of code.
  • Natural language code generation: Converts natural language prompts into working code, simplifying tasks like making design changes or debugging.
  • Automated code completion: Offers auto-complete suggestions and runtime debugging to help automate repetitive coding tasks, speeding up the development process.

Conclusion 

The landscape of AI-powered coding tools is vast and continually evolving, with CodeGen and its alternatives playing critical roles in transforming how development tasks are approached. Each tool offers strengths, catering to various aspects of developer productivity and project demands. Understanding these tools’ capabilities and limitations is crucial for developers intending to integrate AI into their workflows.

Choosing between tools like CodeGen and its alternatives depends largely on the specific needs of a development team or project. While some tools excel in cloud integrations, others might be better suited for collaborative coding environments. A thorough understanding of project goals, infrastructure, and development processes can guide informed decisions regarding the adoption of an AI code generation tool.

Automating Data Migration with AI


One of the biggest challenges businesses face is managing massive amounts of data cost-effectively. This is where data migration comes in. Data migration is the process of transferring data from one system to another, but executing it well is complex and often difficult. 

The advent of artificial intelligence (AI) integration, specifically within customer relationship management (CRM) platforms such as Salesforce, has significantly changed this process. 

AI algorithms can analyze big data quickly and provide reliable and easy-to-use data migration. In this article, you will learn about the importance and challenges of data migration and how automating data migration with AI is connected to Salesforce.

Overview of Data Migration

Salesforce data migration involves transferring data from one system to another within the platform. This is essential for any new Salesforce organization, upgrading an existing Salesforce instance, or integrating Salesforce with other systems. Incorporating AI into Salesforce data migration offers numerous benefits.

Businesses can handle larger data volumes more efficiently, leading to faster implementation and quicker value realization from their Salesforce investments.

Data is only meaningful when properly managed and utilized to help organizations make effective decisions and manage their business operations. A successful data migration ensures that all necessary data is accurately transferred, maintaining data integrity and minimizing business downtime. 

Without technology like MuleSoft, uploading data into Salesforce requires significant manual effort. Human errors such as data duplication or incomplete data migration are common in manual processes. These errors can lead to data discrepancies, resulting in poor decision-making, decreased trust in the system, and reduced overall reliability.

Role of AI in Data Migration

Artificial intelligence (AI) has enhanced productivity, decision-making, and operations, among many other areas, across different sectors. One important area where AI is highly effective is data migration: the transfer of data between two systems or platforms. 

AI can bring enormous benefits to automating data migration. AI-based algorithms can effectively analyze the structure of existing records, finding patterns and correlations that must be considered when migrating data. 

This ensures that no crucial information falls through the cracks or is misplaced, resulting in a smoother migration with the least impact on business operations. It also allows the automation of many repetitive and time-consuming migration tasks, such as data mapping, transformation, and validation, which reduces manual effort and saves time and resources. 

Furthermore, it can help solve common data issues, such as identifying inconsistent or duplicate data entries, and automate data cleaning and integrity checks. This produces accurate and consistent migrated data, which enhances decision-making and makes the data well suited for analysis.

Best Practices for Automating Data Migration

With the advancements in artificial intelligence (AI), companies can now leverage this technology to automate their data migration tasks. However, following best practices specifically designed for automating data migration with AI is crucial to ensure a successful and seamless migration. 

Planning and Assessment

Planning and assessment are key to any successful migration. This includes understanding the current state of the data, setting goals, and determining key performance indicators (KPIs) for success. Analyze the infrastructure, applications, and databases and their suitability for the new environment, and thoroughly evaluate the existing data and systems. 

This helps define the steps and resources the migration will need for a successful transition. Having clear objectives and KPIs is equally crucial: migration success can then be measured against the concrete outcomes the organization has in mind, typically better performance, cost savings, or both. 

KPIs, such as downtime reduction and data accuracy, establish how to measure effectiveness. Careful evaluation and planning help reduce risks, allocate appropriate resources, make necessary changes, and create a solid foundation for a smooth and successful migration.

Data Quality Management

Ensuring data is accurate, complete, and consistent is crucial. Before migrating data, verify its integrity, ensure it fits the required format, and validate it against predefined rules. Ensure data completeness and resolve inconsistencies like duplicate data or conflicting values. 

AI can help identify and correct anomalies, detect outliers, and automatically fix errors, continuously improving data accuracy and reliability. Clean, reliable data enables better decision-making, improved operational efficiency, and enhanced customer experiences. 
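As a toy illustration of the kind of duplicate detection described above (production tools use far more sophisticated, often ML-based matching; the record fields here are hypothetical):

```python
def find_duplicates(records, key_fields):
    """Flag pairs of records whose normalized key fields collide."""
    seen, dupes = {}, []
    for rec in records:
        # Normalize: trim whitespace and lowercase each key field.
        key = tuple(str(rec.get(f, "")).strip().lower() for f in key_fields)
        if key in seen:
            dupes.append((seen[key], rec))
        else:
            seen[key] = rec
    return dupes

records = [
    {"name": "Acme Corp", "email": "info@acme.example"},
    {"name": "ACME CORP ", "email": "INFO@ACME.EXAMPLE"},
    {"name": "Globex", "email": "hello@globex.example"},
]
dupes = find_duplicates(records, ["name", "email"])  # one colliding pair
```

Even this trivial normalization catches duplicates that byte-for-byte comparison would miss; AI-based approaches extend the same idea to fuzzy and semantic matches.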

Mapping and Transformation

Mapping and transforming data from a source system to a target system are critical steps. Create a data map to define the relationship between source and target fields. This map acts as a blueprint, ensuring accurate data transfer. 

Traditionally, these tasks were manual and time-consuming, but AI can automate and optimize these processes. AI algorithms intelligently analyze and map data, speeding up the process and reducing errors.

Organizations can use AI for intelligent data mapping and transformation to enhance their data integration capabilities and improve overall data quality. Automated mapping and transformation processes enable faster and more accurate data transfers, saving time and resources.
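The data map described above can be expressed as a simple lookup table plus per-field transforms. A minimal sketch, with hypothetical field names (Email__c stands in for a Salesforce-style custom field):

```python
# Hypothetical field map from a legacy CRM to Salesforce-style field names.
FIELD_MAP = {
    "cust_name": "Name",
    "cust_email": "Email__c",
    "created": "CreatedDate",
}

def transform(record, field_map, transforms=None):
    """Rename source fields per the map and apply per-field transforms."""
    transforms = transforms or {}
    out = {}
    for src, dst in field_map.items():
        if src in record:
            value = record[src]
            # Apply the target field's transform, or pass through unchanged.
            out[dst] = transforms.get(dst, lambda v: v)(value)
    return out

row = {"cust_name": "  acme corp ", "cust_email": "info@acme.example"}
clean = transform(row, FIELD_MAP, {"Name": lambda v: v.strip().title()})
```

In AI-assisted tools, the FIELD_MAP itself is what gets suggested automatically by analyzing both schemas, instead of being written by hand.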

Automation Strategy

Developing an automation strategy involves planning to automate repetitive tasks using AI-driven tools. This boosts efficiency, reduces errors, and frees up resources for strategic activities. 

Identify tasks suitable for automation, select appropriate AI tools, and ensure seamless integration with existing systems and workflows, which requires careful planning and coordination. Provide employee training and support to ensure a smooth transition, and regularly monitor and evaluate the impact on efficiency, accuracy, and resource allocation against defined KPIs, adjusting the strategy as needed.

Testing and Validation

Testing and validation are essential to ensure a smooth transition. Conduct pre-migration tests to identify potential issues with the source data, such as missing or incomplete records, data inconsistencies, or formatting errors. 

Use AI for automated testing and validation. Quickly compare source and target data to ensure integrity and successful migration. AI can also identify anomalies or discrepancies, saving time and reducing human error.
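One straightforward way to compare source and target data, as suggested above, is to digest each record's fields and diff the digests. A minimal stdlib sketch (a real migration would also match records by key rather than by position):

```python
import hashlib

def record_digest(record, fields):
    """Stable digest of the selected fields, for source/target comparison."""
    payload = "|".join(str(record.get(f, "")) for f in fields)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def compare(source, target, fields):
    """Return indices of record pairs whose digests differ after migration."""
    mismatches = []
    for i, (s, t) in enumerate(zip(source, target)):
        if record_digest(s, fields) != record_digest(t, fields):
            mismatches.append(i)
    return mismatches

src = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
tgt = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex Inc"}]
bad = compare(src, tgt, ["id", "name"])  # → [1]
```

Digesting keeps the comparison cheap even for wide records, and the mismatch indices tell you exactly which rows to investigate.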

Monitoring and Reporting

Monitoring and reporting are crucial for tracking migration progress. Set up robust monitoring systems to collect data on various parameters. Use AI for real-time reporting and anomaly detection, providing valuable insights for timely decision-making. 

Regular reports should highlight important metrics and trends, informing stakeholders about the migration’s progress and any challenges encountered. Sharing these reports with business stakeholders and other relevant parties supports timely decision-making and allows the migration plan to be adjusted where needed.
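A simple example of the kind of anomaly detection mentioned above: flag any throughput sample that falls well below the recent moving average. The window and threshold values are arbitrary illustrations, not recommendations:

```python
def detect_anomalies(samples, window=3, threshold=0.5):
    """Flag indices of samples that drop more than `threshold` (as a
    fraction) below the moving average of the previous `window` samples."""
    flagged = []
    for i in range(window, len(samples)):
        avg = sum(samples[i - window:i]) / window
        if avg > 0 and samples[i] < avg * (1 - threshold):
            flagged.append(i)
    return flagged

throughput = [1000, 980, 1010, 995, 400, 990]  # records migrated per minute
flagged = detect_anomalies(throughput)  # → [4], the sudden dip
```

Real monitoring tools layer alerting and richer statistics on top, but the core idea is the same: compare each new measurement against recent history.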

Tools for Automating Data Migration in Salesforce

Data migration is a critical process in Salesforce that involves transferring data from one system to another. Automating this process can help businesses save time, reduce errors, and ensure data integrity. 

Salesforce CRM Analytics

Salesforce CRM Analytics offers several features and capabilities for data migration, making it a powerful tool for businesses:

  • Data Integration: Seamlessly brings data from various sources into Salesforce, consolidating it into a central location for easier access and analysis.
  • Advanced Mapping Capabilities: Automatically map data fields from different sources to corresponding fields in Salesforce, ensuring accurate and efficient data transfer.
  • Data Transformation: Perform data cleansing and normalization to ensure data integrity and consistency.
  • AI-Powered Analytics: Utilize AI algorithms to detect patterns and relationships within the data, providing insights and trends that might be overlooked. This includes predictive analytics for data-driven forecasts and projections.
  • Data Quality Management: Automatically identify and flag potential data inconsistencies or errors during migration, maintaining clean and accurate data within Salesforce.

Mulesoft Anypoint Platform

Mulesoft’s Anypoint Platform is a comprehensive integration platform that connects applications, data, and devices across an entire ecosystem. 

Key features include:

  • API Management: Create, design, and manage APIs to expose data and services to external developers and partners, unlocking the value of existing systems and data.
  • AI for Data Migration: Leverage AI and machine learning algorithms to understand and map data structures of different systems, facilitating faster and error-free data migration.
  • Wide Range of Connectors: Access a variety of connectors and pre-built integration templates to easily link and integrate applications, data sources, and IoT devices, enhancing flexibility and scalability.
  • Automation: Reduce manual data mapping and disruptions during migration by automating the data transfer process.

Informatica Intelligent Cloud Services (IICS)

Informatica Intelligent Cloud Services (IICS) offers a robust data integration and management platform with AI-powered tools. Key features include:

  • Seamless Salesforce Integration: Use pre-built connectors and templates specifically designed for Salesforce, making data migration straightforward.
  • AI-Driven Automation: Automate tasks like data mapping to ensure accurate data migration and minimize errors. Machine learning algorithms enhance the migration process by identifying and resolving potential issues.
  • Data Quality and Cleansing: Ensure the integrity and relevance of Salesforce data through advanced data quality management and cleansing capabilities.

Talend Data Fabric

Talend Data Fabric is a comprehensive data integration and management solution that provides tools and features specifically designed to streamline data migration. Key capabilities include:

  • Data Integration: Support for various data sources, including databases, flat files, and cloud-based applications, ensuring smooth data extraction and integration.
  • AI-Driven Transformation: Automate complex data transformation tasks using AI, reducing manual effort and minimizing the risk of errors.
  • Data Quality Management: Utilize AI to detect and resolve data quality issues such as duplicates, missing values, and inconsistencies, ensuring clean, reliable, and error-free data migration.

Final Word

Integrating artificial intelligence into data migration processes, particularly within the Salesforce platform, has significantly revolutionized how businesses handle data transfers. AI-driven tools and algorithms automate and enhance critical tasks such as data mapping, transformation, and validation, reducing errors, minimizing disruptions, and improving data quality. 

Organizations can achieve seamless, efficient, and accurate data migrations by adopting best practices and leveraging advanced tools like Salesforce CRM Analytics, Mulesoft Anypoint Platform, Informatica Intelligent Cloud Services, and Talend Data Fabric. 

This transformation accelerates implementation, enhances operational efficiency, and strengthens decision-making capabilities.

🇮🇹 Salesforce Sidekicks EPISODE 1: Salesforce e l’AI

ℹ️ What’s this all about? Salesforce Sidekicks

If you too use ChatGPT to create fluffy slides, this is the episode for you!

We’ll also ask how AI impacts the Salesforce ecosystem and shed some light on the numbers, the challenges, and the opportunities.

Salesforce Sidekicks EPISODE 1: Salesforce e l’AI (Spotify)


[Salesforce / Einstein] Playing around with apples and Einstein Prediction APIs

The machines are not going to rule humanity…for now.

So don’t be afraid of AI in your daily job as Awesome Developer / Admin / Adminloper.

A new revolution has come in the CRM world and Salesforce leads it as usual.

Einstein is AI brought to our beloved CRM platform, in many ways: it enriches your sales decisions and marketing strategies, smartifies your communities and your social behavior.

I know what you are thinking: how can a humble Salesforce developer harness Artificial Intelligence?

Again, afraid be not!

Salesforce provides a set of APIs for image recognition and text analysis, so you can integrate the power of AI into your application, whether inside Salesforce or not.

What can you do with Einstein APIs?

At the time of writing, you can:

  • Classify images
  • Detect number, size and position of objects inside images
  • Classify sentiment in text
  • Categorize unstructured text into user-defined labels

Read the complete documentation at metamind.readme.io.

In this post I’ll cover an example of how to classify images using Einstein Vision.

Use Case

Can you guess a business use case for this API?

A particular piece of my fridge just broke down, and it is difficult to explain in words which part should be replaced.

Just take a picture of the part and submit it to the (properly trained) Einstein Vision engine: the back-office user can now tell the “replacement department” which part should be sent to the customer.

Another example: my oven is not working and I don’t remember the model. Take a pic, send it to the Einstein engine, and the system can guess the model and execute the proper actions.

In our example we’ll just try to classify apples: not a cool business use case, but it effectively shows how the library works.

First configurations

The first thing to do is register for the free Einstein Vision tier.

Go to https://api.einstein.ai/signup, register with your Developer ORG edition (use the Salesforce flow) and then download and save the provided key in the einstein_platform.pem file.

Go to your ORG and create a new Static Resource for this certificate and call it Einstein_Platform: this will be used to generate a JWT OAuth token every time it is needed.

Now create a new Remote Site Setting adding the https://api.metamind.io endpoint (this is the Einstein Vision API endpoint).

Now you are ready to use the provided package.

Before starting, you should install the following open-source Einstein Vision wrapper code into your ORG. Download and install it from the REPO: it’s just 2 pages and 2 classes.

Before starting, be sure to change your Einstein API email address in the EinsteinVisionDemoController:

public static String getAccessToken(){
    // Load the private key uploaded as the Einstein_Platform Static Resource
    String keyContents = [Select Body From StaticResource Where Name = 'Einstein_Platform' limit 1].Body.toString();
    keyContents = keyContents.replace('-----BEGIN RSA PRIVATE KEY-----', '');
    keyContents = keyContents.replace('-----END RSA PRIVATE KEY-----', '');
    keyContents = keyContents.replace('\n', '');

    // Build a new RS256-signed JWT and exchange it for an access token
    JWT jwt = new JWT('RS256');
    jwt.pkcs8 = keyContents;
    jwt.iss = 'developer.force.com';
    jwt.sub = '[email protected]'; // replace with your Einstein API email address
    jwt.aud = 'https://api.metamind.io/v1/oauth2/token';
    jwt.exp = '3600'; // token lifetime in seconds
    String access_token = JWTBearerFlow.getAccessToken('https://api.metamind.io/v1/oauth2/token', jwt);
    return access_token;    
}

Configure the Dataset

This repo has a configuration page (for model training) and a prediction page (see the live demo here).

Let’s open the administration page named EinsteinVisionDemoAdmin.

In the Dataset URL input copy the following dataset URL: https://raw.githubusercontent.com/enreeco/sf-einstein-vision-prediction-demo/master/dataset/mele.zip.

This ZIP file contains 6 folders: each folder represents a kind of apple (the folder name is the label) and contains 40–50 images of that kind of apple (I’m not an expert on apples, so some pictures may not be correct!).

Now press the Create Model Async button: there are 2 kinds of API for this purpose, one sync (which accepts zip files of up to 5 MB) and one async (which accepts files larger than 5 MB).

This means that in this example we’ll be using only the async API. The request is accepted:

DATASET:

{
  "updatedAt" : "2017-07-11T14:17:33.000Z",
  "totalLabels" : null,
  "totalExamples" : 0,
  "statusMsg" : "UPLOADING",
  "name" : "mele",
  "labelSummary" : {
    "labels" : [ ]
  },
  "id" : 1006545,
  "createdAt" : "2017-07-11T14:17:33.000Z",
  "available" : false
}

Now you can press the button labelled Get All Datasets and Models to watch the upload operation complete:

Datasets: 1

{
  "updatedAt" : "2017-07-11T14:17:37.000Z",
  "totalLabels" : 6,
  "totalExamples" : 266,
  "statusMsg" : "SUCCEEDED",
  "name" : "mele",
  "labelSummary" : {
    "labels" : [ {
      "numExamples" : 38,
      "name" : "red_delicious",
      "id" : 52011,
      "datasetId" : 1006545
    }, {
      "numExamples" : 44,
      "name" : "granny_smith",
      "id" : 52012,
      "datasetId" : 1006545
    }, {
      "numExamples" : 45,
      "name" : "royal_gala",
      "id" : 52013,
      "datasetId" : 1006545
    }, {
      "numExamples" : 42,
      "name" : "golden",
      "id" : 52014,
      "datasetId" : 1006545
    }, {
      "numExamples" : 53,
      "name" : "renetta",
      "id" : 52015,
      "datasetId" : 1006545
    }, {
      "numExamples" : 44,
      "name" : "fuji",
      "id" : 52016,
      "datasetId" : 1006545
    } ]
  },
  "id" : 1006545,
  "createdAt" : "2017-07-11T14:17:33.000Z",
  "available" : true
}

Now we can train our model by copying the dataset id into the Dataset ID input box and pressing the Train Model button: Einstein analyzes the images with its deep learning algorithm to allow prediction.

MODEL:

{
  "updatedAt" : "2017-07-11T14:21:05.000Z",
  "trainStats" : null,
  "trainParams" : null,
  "status" : "QUEUED",
  "queuePosition" : 1,
  "progress" : 0.0,
  "name" : "My Model 2017-07-11 00:00:00",
  "modelType" : "image",
  "modelId" : "UOHHRLYEH2NGBPRAS64JQLPCNI",
  "learningRate" : 0.01,
  "failureMsg" : null,
  "epochs" : 3,
  "datasetVersionId" : 0,
  "datasetId" : 1006545,
  "createdAt" : "2017-07-11T14:21:05.000Z"
}

The process is asynchronous and takes some time to complete (it depends on the parameters passed to the train API, see code).

Press the Get All Datasets and Models button to see the process ending:

Datasets: 1

The dataset JSON is identical to the one shown above; the model record now reports:

{
  "updatedAt" : "2017-07-11T14:22:33.000Z",
  "trainStats" : null,
  "trainParams" : null,
  "status" : "SUCCEEDED",
  "queuePosition" : null,
  "progress" : 1.0,
  "name" : "My Model 2017-07-11 00:00:00",
  "modelType" : "image",
  "modelId" : "UOHHRLYEH2NGBPRAS64JQLPCNI",
  "learningRate" : null,
  "failureMsg" : null,
  "epochs" : null,
  "datasetVersionId" : 3796,
  "datasetId" : 1006545,
  "createdAt" : "2017-07-11T14:21:05.000Z"
}

We are almost ready!

Predict!

The only thing you have to do is open the EinsteinVisionDemo page, passing the above Model Id (e.g. /apex/EinsteinVisionDemo?model=UOHHRLYEH2NGBPRAS64JQLPCNI):

The dataset used is not the best one out there: it was created with the help of Google and a little common sense, and each folder holds only 40–50 images, which means the algorithm does not have much data to work with…but it still does its job!

“May the Force.com be with you!” [cit. Yodeinstein]
