# VLM Run by Maxmel Tech
VLM Run offers a unified API for visual AI with structured outputs, agentic AI support, and real-time analytics, enabling developers to quickly deploy visual AI solutions. Integrating with Make.com streamlines automation and expands visual AI capabilities for users.

This is AI-generated content based on official VLM Run documentation. The content may still contain errors; please verify important information. If you have questions, contact https://www.vlm.run/ directly.

## How to get support on VLM Run

VLM Run is a community-developed application and is subject to the developer's terms and conditions, which may include applicable fees. Make does not maintain or support this integration. For assistance, please contact the developer at https://f.make.com/r/reachout?app_name=vlm%20run&app_slug=vlm-run-community&app_unique_id=vlm-run-l25hv4.

## Requirements

To use the VLM Run app in Make, you must sign up for an account and obtain an API key at https://www.vlm.run/.

## App installation

To install this app, you need Admin, Owner, or App Developer permissions (see docid: foycaspyp9uykgm7lqpb).

1. Go to https://www.make.com/en/integrations/vlm-run-community/.
2. Click the Install button.
3. Follow the on-screen instructions to complete the setup.

## Connect VLM Run and {{product name}}

To get started, you must first create a connection between VLM Run and {{product name}}, allowing the two services to communicate. You can connect using the following method: API keys.

### API keys instructions

You need to retrieve your VLM Run API key from the dashboard:

1. Log in to your platform using your credentials.
2. Navigate to the dashboard.
3. Locate the section where your API keys are listed.
4. Copy your VLM Run API key.

Some community developers require an additional specific API key. Retrieve the required API key from the community developer's dashboard, generally found under Settings/API. For more info, please contact the developer at https://f.make.com/r/reachout?app_name=vlm%20run&app_slug=vlm-run-community&app_unique_id=vlm-run-l25hv4.

### Create the connection in {{product name}}

1. In your {{product name}} account (https://www.make.com/en/register), add the VLM Run module to your {{scenario singular lowercase}} and click Create a connection. If you add a module with an Instant tag, click Create a webhook, then Create a connection.
2. Optional: In the Connection name field, enter a name for the connection.
3. Enter the authentication credentials you copied from your VLM Run dashboard, or follow the on-screen instructions to create the connection. You must also enter any required third-party API keys obtained earlier.

For detailed information on connecting an application, see docid: so88fm6pkt0g adkddfzz.

## VLM Run modules

After connecting to the VLM Run app, you can choose from a list of available modules to build your {{scenario plural lowercase}}.

- **Analyze an Image**: Creates a detailed analysis by generating a structured prediction based on the content of the provided image.
- **Get a Completion**: Fetches the detailed results from a previous parsing or transcription process, allowing you to access the extracted or converted data.
- **List Files**: Fetches and displays a list of all files that have been uploaded, allowing you to view and manage your uploaded documents.
- **Make an API Call**: Allows you to make a custom API request to the connected service using your authorized account, giving you flexibility to access features or data not covered by the standard modules.
- **Parse a Document**: Creates detailed, organized predictions based on the content of the provided document.
- **Parse a Video**: Creates detailed, organized predictions based on the content of the provided video.
- **Transcribe an Audio**: Creates detailed, structured predictions by analyzing the provided audio file.
- **Upload a File**: Uploads a selected file directly to your VLM account, making it available for use within the platform.
- **Watch Completed Predictions**: This module activates whenever a prediction process finishes, allowing you to take action as soon as the prediction results are available.

## Templates

You can look for more templates at https://www.make.com/en/templates, where you'll find thousands of pre-created {{scenario plural lowercase}}.

## VLM Run resources

You can access more resources related to this app via the following links:

- https://www.vlm.run/
- https://docs.vlm.run/introduction
- https://docs.vlm.run/api-reference
- https://www.make.com/en/integrations/vlm-run-community
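As an aside, the request pattern behind the Make an API Call module (an authenticated JSON call against the VLM Run API) can be sketched in plain Python. This is a minimal sketch, not VLM Run's official client: the base URL, the `document/generate` path, the Bearer auth scheme, and the payload fields are assumptions here; consult https://docs.vlm.run/api-reference for the actual endpoints and schema.

```python
import json
from urllib import request

API_BASE = "https://api.vlm.run/v1"  # assumed base URL; verify in the API reference


def build_api_request(api_key: str, path: str, payload: dict) -> request.Request:
    """Construct an authenticated JSON POST request, mirroring what the
    'Make an API Call' module does with the API key stored in your connection."""
    return request.Request(
        url=f"{API_BASE}/{path.lstrip('/')}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Example: a hypothetical document-parse call (path and fields are illustrative).
# Sending it would be: request.urlopen(req) -- omitted here, no real key is set.
req = build_api_request("YOUR_VLMRUN_API_KEY", "document/generate", {"file_id": "abc123"})
print(req.full_url)                     # https://api.vlm.run/v1/document/generate
print(req.get_header("Authorization"))  # Bearer YOUR_VLMRUN_API_KEY
```

The same header and payload shape applies whichever endpoint you target, so a single helper like this covers any custom call the standard modules do not expose.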