{"id":136,"date":"2024-02-10T08:58:12","date_gmt":"2024-02-10T08:58:12","guid":{"rendered":"https:\/\/zahiralam.com\/blog\/?p=136"},"modified":"2024-10-29T05:58:28","modified_gmt":"2024-10-29T05:58:28","slug":"step-by-step-guide-to-installing-ollama-on-mac","status":"publish","type":"post","link":"https:\/\/zahiralam.com\/blog\/step-by-step-guide-to-installing-ollama-on-mac\/","title":{"rendered":"Step-by-Step Guide to Installing Ollama on Mac"},"content":{"rendered":"\n<figure class=\"wp-block-embed is-type-rich is-provider-spotify wp-block-embed-spotify wp-embed-aspect-21-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe title=\"Spotify Embed: Step-by-Step Guide to Installing Ollama on Mac: Unlock AI Power Locally (Podcast)\" style=\"border-radius: 12px\" width=\"100%\" height=\"152\" frameborder=\"0\" allowfullscreen allow=\"autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture\" loading=\"lazy\" src=\"https:\/\/open.spotify.com\/embed\/episode\/4fBqKv9oke6eNGY7IPxheU?si=47ZNysoPQQm54v2HnGuIvg&#038;utm_source=oembed\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<p>\n\n\n\n<p>Ollama is a fantastic tool that allows you to run powerful large language models (LLMs) like <a href=\"https:\/\/zahiralam.com\/blog\/how-to-install-llama-3-1-on-mac-m1-m2-and-m3\/\">Llama 3.1<\/a>, <a href=\"https:\/\/zahiralam.com\/blog\/installing-llama-3-2-on-mac-m1-m2-and-m3-your-gateway-to-ai-power\/\">Llama 3.2<\/a>, Gemma 2, Code Llama, and many more directly on your Mac. This means you can experiment with and use these AI language models without relying on cloud services or dealing with internet connectivity issues. 
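<p>Running locally also means your own programs can talk to the models: the Ollama server exposes a REST API on <code>http:\/\/localhost:11434<\/code> by default. The following is a minimal sketch of building a request for its <code>\/api\/generate<\/code> endpoint; it assumes Ollama is installed and the example model <code>llama3.1:8b<\/code> has been pulled, and the actual network call is left commented out so the snippet runs even without the server:

```python
import json
from urllib import request

# Sketch only: assumes Ollama is running locally on its default port
# (11434) and that the model "llama3.1:8b" has already been pulled.
payload = {
    "model": "llama3.1:8b",          # example model name
    "prompt": "Why is the sky blue?",
    "stream": False,                  # ask for a single JSON response
}
body = json.dumps(payload).encode("utf-8")
req = request.Request(
    "http://localhost:11434/api/generate",
    data=body,
    headers={"Content-Type": "application/json"},
)
# Uncomment once the Ollama server is running:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
print(body.decode("utf-8"))
```

The installation steps below get you to the point where uncommenting the request works.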
\n\n\n\n<p>This article will guide you through the simple process of installing and running Ollama on your Mac.\n\n\n\n<h2 class=\"wp-block-heading\">Prerequisites:<\/h2>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Mac with macOS 11 Big Sur or later<\/li>\n\n\n\n<li>Internet connection for initial download<\/li>\n<\/ol>\n\n\n\n<p>\n\n\n\n<h2 class=\"wp-block-heading\">Installation Methods:<\/h2>\n\n\n\n<p>There are two main ways to install Ollama on your Mac:\n\n\n\n<p>\n\n\n\n<h3 class=\"wp-block-heading\">1. Downloading the App:<\/h3>\n\n\n\n<p>1. <strong>Visit the Ollama Website<\/strong>: Go to <a rel=\"noreferrer noopener\" href=\"https:\/\/ollama.com\/download\" target=\"_blank\">https:\/\/ollama.com\/download<\/a>\n\n\n\n<p>2. <strong>Download the Application<\/strong>:&nbsp; Click on the \u201cDownload for macOS\u201d button.\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-3.png\" alt=\"\" class=\"wp-image-898\" width=\"729\" height=\"539\" srcset=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-3.png 918w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-3-300x222.png 300w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-3-768x569.png 768w\" sizes=\"auto, (max-width: 729px) 100vw, 729px\" \/><\/figure>\n\n\n\n<p>3. <strong>Save the File<\/strong>: Choose your preferred download location and save the&nbsp;<code>.zip<\/code>&nbsp;file.\n\n\n\n<p>4. <strong>Locate the Download<\/strong>: After downloading, you might notice that the&nbsp;<code>Ollama-darwin.zip<\/code>&nbsp;file is automatically moved to the Trash, and the application appears in your Downloads folder as &#8220;Ollama&#8221; with the type &#8220;Application (Universal)&#8221;.\n\n\n\n<p>5. 
<strong>Open the Application<\/strong>: Navigate to your Downloads folder and double-click on the &#8220;Ollama&#8221; application. You will see a security prompt indicating that the application was downloaded from the internet.\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-4.png\" alt=\"\" class=\"wp-image-900\" width=\"563\" height=\"443\" srcset=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-4.png 742w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-4-300x236.png 300w\" sizes=\"auto, (max-width: 563px) 100vw, 563px\" \/><\/figure>\n\n\n\n<p>6. <strong>Security Prompt<\/strong>: Click &#8220;Open&#8221; in the security prompt to proceed with launching Ollama.\n\n\n\n<p>7. <strong>Move to Applications Prompt<\/strong>: After clicking &#8220;Open,&#8221; you will see a prompt indicating that Ollama works best when run from the Applications directory.\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-6.png\" alt=\"\" class=\"wp-image-902\" width=\"439\" height=\"408\" srcset=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-6.png 566w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-6-300x279.png 300w\" sizes=\"auto, (max-width: 439px) 100vw, 439px\" \/><\/figure>\n\n\n\n<p>8. <strong>Move to Applications<\/strong>: Click &#8220;Move to Applications&#8221; in the prompt to move Ollama to your Applications folder for optimal performance.\n\n\n\n<p>9. <strong>Launch the Application<\/strong>: Navigate to your Applications folder and launch Ollama. 
Once running, you will see the Ollama icon in the desktop menu bar, indicating that the application is running successfully.\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"169\" src=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-7-1024x169.png\" alt=\"\" class=\"wp-image-903\" srcset=\"https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-7-1024x169.png 1024w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-7-300x49.png 300w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-7-768x127.png 768w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-7-1536x253.png 1536w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-7-2048x338.png 2048w, https:\/\/zahiralam.com\/blog\/wp-content\/uploads\/2024\/08\/image-7-1800x297.png 1800w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>10. <strong>Access Ollama from Terminal<\/strong>: You can now access and control Ollama directly from the Terminal. 
For example, to download the Llama 3.1 8B model, use the following command:\n\n\n\n<div class=\"code-block-container\">\n                        <pre class=\"wp-block-code\"><code id=\"code-1\">ollama run llama3.1:8b<\/code><\/pre>\n                        <amp-iframe sandbox=\"allow-scripts\" width=\"94\" height=\"72\" frameborder=\"0\" \n                                    src=\"https:\/\/zahiralam.com\/blog\/wp-content\/plugins\/amp-copy-code-button\/copier.html#ollama%20run%20llama3.1%3A8b\">\n                            <button class=\"copy-button\" data-label=\"ollama run llama3.1:8b\"  placeholder disabled>Copy<\/button>\n                        <\/amp-iframe>\n                    <\/div>\n\n\n\n<p>For a detailed guide on installing the Llama 3.1 model on Mac M1, M2, and M3, you can refer to\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-zahirs-blog wp-block-embed-zahirs-blog\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"MhgYkrBgLj\"><a href=\"https:\/\/zahiralam.com\/blog\/how-to-install-llama-3-1-on-mac-m1-m2-and-m3\/\">How to Install Llama 3.1 on Mac M1, M2, and M3<\/a><\/blockquote><iframe loading=\"lazy\" class=\"wp-embedded-content\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; clip: rect(1px, 1px, 1px, 1px);\" title=\"&#8220;How to Install Llama 3.1 on Mac M1, M2, and M3&#8221; &#8212; Zahirs Blog\" src=\"https:\/\/zahiralam.com\/blog\/how-to-install-llama-3-1-on-mac-m1-m2-and-m3\/embed\/#?secret=3qVIrJsVU0#?secret=MhgYkrBgLj\" data-secret=\"MhgYkrBgLj\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><strong>Bonus Tip:<\/strong>&nbsp;If you\u2019re looking to enhance your experience by using a graphical interface, consider setting up Open WebUI, a user-friendly, browser-based interface that works seamlessly with Ollama. 
It allows you to manage your LLM runners offline with ease. For detailed instructions on how to set it up, check out my guide:&nbsp;<a href=\"https:\/\/zahiralam.com\/blog\/set-up-open-webui-with-ollama-on-mac-your-guide-to-offline-ai-mastery\/\">Set Up Open WebUI with Ollama on Mac: Your Guide to Offline AI Mastery<\/a>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-zahirs-blog wp-block-embed-zahirs-blog\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"57AqiykgVo\"><a href=\"https:\/\/zahiralam.com\/blog\/set-up-open-webui-with-ollama-on-mac-your-guide-to-offline-ai-mastery\/\">Set Up Open WebUI with Ollama on Mac: Your Guide to Offline AI Mastery<\/a><\/blockquote><iframe loading=\"lazy\" class=\"wp-embedded-content\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; clip: rect(1px, 1px, 1px, 1px);\" title=\"&#8220;Set Up Open WebUI with Ollama on Mac: Your Guide to Offline AI Mastery&#8221; &#8212; Zahirs Blog\" src=\"https:\/\/zahiralam.com\/blog\/set-up-open-webui-with-ollama-on-mac-your-guide-to-offline-ai-mastery\/embed\/#?secret=GesXS1fLOY#?secret=57AqiykgVo\" data-secret=\"57AqiykgVo\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">2. Using Homebrew (Optional):<\/h3>\n\n\n\n<p>If you already use Homebrew, a package manager for macOS, you can install Ollama through the command line:\n\n\n\n<p>1. Open a Terminal window.\n\n\n\n<p>2. 
Run the following command:\n\n\n\n<div class=\"code-block-container\">\n                        <pre class=\"wp-block-code\"><code id=\"code-2\">brew install ollama<\/code><\/pre>\n                        <amp-iframe sandbox=\"allow-scripts\" width=\"94\" height=\"72\" frameborder=\"0\" \n                                    src=\"https:\/\/zahiralam.com\/blog\/wp-content\/plugins\/amp-copy-code-button\/copier.html#brew%20install%20ollama\">\n                            <button class=\"copy-button\" data-label=\"brew install ollama\"  placeholder disabled>Copy<\/button>\n                        <\/amp-iframe>\n                    <\/div>\n\n\n\n<p>3. Follow the on-screen instructions during the installation process.\n\n\n\n<p>\n\n\n\n<h2 class=\"wp-block-heading\">Running Ollama:<\/h2>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Open Ollama from your Applications folder.<\/li>\n\n\n\n<li>The Ollama icon will appear in the menu bar; the app runs as a background server rather than opening a window of its own.<\/li>\n\n\n\n<li>Open a Terminal window and start a model with <code>ollama run<\/code>, for example <code>ollama run llama3.1:8b<\/code>.<\/li>\n\n\n\n<li>Type your prompt in the terminal window and press Enter.<\/li>\n\n\n\n<li>Ollama will process your prompt and generate a response using the chosen model.<\/li>\n<\/ol>\n\n\n\n<p>\n\n\n\n<p>\n\n\n\n<h2 class=\"wp-block-heading\">Additional Tips:<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Ollama requires downloading the chosen model for initial use. 
This might take some time depending on your internet speed.<\/li>\n\n\n\n<li>You can explore other available models or even create your own custom models within Ollama.<\/li>\n\n\n\n<li>For detailed instructions and troubleshooting, refer to the official Ollama documentation: <a href=\"https:\/\/github.com\/ollama\/ollama\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/github.com\/ollama\/ollama<\/a><\/li>\n<\/ul>\n\n\n\n<p>By following these steps, you can easily install and start using Ollama on your Mac to unlock the power of large language models for your own exploration and experimentation.\n\n\n\n<p>\n\n\n\n<h3 class=\"wp-block-heading\">Frequently Asked Questions (FAQ)<\/h3>\n\n\n\n<p>Q1: How do I install Ollama on Mac?<br>A: You can install Ollama on Mac by either downloading the app from the official website (ollama.com\/download) or using Homebrew with the command <code>brew install ollama<\/code>. Detailed steps are provided in the article above.\n\n\n\n<p>Q2: Can I install Ollama using Homebrew?<br>A: Yes, you can install Ollama using Homebrew. Simply open a terminal and run <code>brew install ollama<\/code>.\n\n\n\n<p>Q3: What are the system requirements for Ollama on Mac?<br>A: Ollama requires macOS 11 Big Sur or later.\n\n\n\n<p>Q4: Does Ollama work on Apple Silicon Macs (M1, M2, M3)?<br>A: Yes, Ollama is compatible with both Intel and Apple Silicon Macs.\n\n\n\n<p>Q5: How do I run Ollama after installation?<br>A: After installation, you can run Ollama by opening it from your Applications folder or by using the terminal.\n\n\n\n<p>Q6: Where are Ollama models stored on Mac?<br>A: By default, Ollama stores models in the <code>~\/.ollama\/models<\/code> directory in your home folder. You can check the official documentation for specifics.\n\n\n\n<p>Q7: How do I update Ollama on Mac?<br>A: If you installed via Homebrew, you can update Ollama using <code>brew upgrade ollama<\/code>. 
For the app version, check the official website for the latest version and download it.\n\n\n\n<p>Q8: Can I use Ollama offline?<br>A: Once you&#8217;ve downloaded the models, you can use Ollama offline. However, the initial model download requires an internet connection.\n\n\n\n<p>Q9: How do I uninstall Ollama from my Mac?<br>A: If you installed via Homebrew, use <code>brew uninstall ollama<\/code>. For the app version, you can simply delete the Ollama.app from your Applications folder. To also remove downloaded models, delete the <code>~\/.ollama<\/code> directory in your home folder.\n\n\n\n<p>Q10: Is there a GUI for Ollama on Mac?<br>A: Yes, there are community-developed GUIs available for Ollama. A great option is Open WebUI, which provides a user-friendly, browser-based interface that works seamlessly with Ollama. You can easily set it up and start using it by following this guide:&nbsp;<a href=\"https:\/\/zahiralam.com\/blog\/set-up-open-webui-with-ollama-on-mac-your-guide-to-offline-ai-mastery\/\">Set Up Open WebUI with Ollama on Mac: Your Guide to Offline AI Mastery<\/a>. Check the official documentation or community resources for more information.\n\n\n\n<p>Q11: Can I use Ollama with programming languages like Python or Node.js?<br>A: Yes, Ollama can be integrated with various programming languages. 
Check the official documentation for API usage and examples.\n\n\n\n<p>Q12: How do I start Ollama in the terminal?<br>A: Open a terminal window and type <code>ollama run<\/code> followed by the model name you want to use, e.g., <code>ollama run llama2<\/code>.\n\n\n\n<p>Q13: Where can I find more documentation on Ollama?<br>A: You can find detailed documentation on the official Ollama GitHub repository: <a href=\"https:\/\/github.com\/ollama\/ollama\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/github.com\/ollama\/ollama<\/a>\n\n\n\n<p>Q14: Can I install Ollama using Conda?<br>A: While Conda installation isn&#8217;t officially supported, you may be able to use Conda to create an environment and then install Ollama via other methods.\n\n\n\n<p>Q15: Is there an Ollama client for Mac?<br>A: Ollama itself serves as both a server and client on Mac. There are also third-party clients available.\n\n\n\n<p>Q16: How do I install Ollama models?<br>A: Models are typically downloaded automatically when you first use them. You can also manually install models using the command <code>ollama pull [model_name]<\/code>.\n\n\n\n<p>Q17: Can I use Ollama on Ubuntu or Windows?<br>A: Yes, Ollama is available for Linux distributions such as Ubuntu, and a Windows preview is also available. Check the official website for the latest information.\n\n\n\n<p>Q18: How do I set up Ollama?<br>A: After installation, Ollama is ready to use. You can start it from the Applications folder or terminal and begin using models immediately.\n\n\n\n<p>Q19: Is there an Ollama app for Android?<br>A: As of now, Ollama is primarily for desktop operating systems. Check the official website for the latest on mobile support.\n\n\n\n<p>Q20: How do I use Ollama with React or React Native?<br>A: You can integrate Ollama with React or React Native applications by using Ollama&#8217;s API. 
Refer to the official documentation for integration guides.\n\n\n\n<p>Q21: Where is the Ollama installation location?<br>A: If installed via Homebrew, the binary is typically at <code>\/usr\/local\/bin\/ollama<\/code> on Intel Macs or <code>\/opt\/homebrew\/bin\/ollama<\/code> on Apple Silicon. For the app version, it&#8217;s in your Applications folder.\n\n\n\n<p>Q22: How do I exit Ollama in the terminal?<br>A: In an interactive <code>ollama run<\/code> session, type <code>\/bye<\/code> or press Ctrl+D to exit; pressing Ctrl+C or closing the terminal window also works.\n\n\n\n<p>Q23: Can I use Ollama for production?<br>A: While Ollama is powerful, its suitability for production depends on your specific use case. Consult the documentation and consider factors like licensing and performance requirements.\n\n\n\n<p>Q24: How do I troubleshoot &#8220;address already in use&#8221; errors with Ollama?<br>A: This usually means Ollama is already running. Try stopping existing Ollama processes or changing the port it uses.\n\n\n\n<p>Q25: Can I use Ollama to install Nemotron-Mini, developed by NVIDIA, on my Mac?<br>A: Absolutely! If you\u2019re interested in installing Nemotron-Mini on your Mac M1, M2, or M3, check out our quick-start guide specifically for Nemotron-Mini. It covers the installation process in a few easy steps:\u00a0<a href=\"https:\/\/zahiralam.com\/blog\/quick-start-install-nemotron-mini-on-mac-m1-m2-and-m3-in-minutes\/\">Quick\u00a0Start: Install\u00a0Nemotron-Mini\u00a0on\u00a0Mac\u00a0M1, M2, and\u00a0M3\u00a0in\u00a0Minutes<\/a>.\n\n\n\n<p>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusion<\/h2>\n\n\n\n<p>Congratulations! You\u2019ve successfully installed Ollama on your Mac and can now experiment with powerful large language models directly on your device. If you\u2019re looking to take your AI capabilities even further, consider setting up Open WebUI, a self-hosted, offline interface that works seamlessly with Ollama.\n\n\n\n<p>Open WebUI runs directly in your browser, providing an intuitive and easy-to-understand graphical user interface (GUI). 
This makes managing your LLM runners a breeze, even for those who may not be as familiar with command-line tools.\n\n\n\n<p>To get started, check out my detailed guide:&nbsp;<a href=\"https:\/\/zahiralam.com\/blog\/set-up-open-webui-with-ollama-on-mac-your-guide-to-offline-ai-mastery\/\">Set Up Open WebUI with Ollama on Mac: Your Guide to Offline AI Mastery<\/a>.\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-zahirs-blog wp-block-embed-zahirs-blog\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"57AqiykgVo\"><a href=\"https:\/\/zahiralam.com\/blog\/set-up-open-webui-with-ollama-on-mac-your-guide-to-offline-ai-mastery\/\">Set Up Open WebUI with Ollama on Mac: Your Guide to Offline AI Mastery<\/a><\/blockquote><iframe loading=\"lazy\" class=\"wp-embedded-content\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; clip: rect(1px, 1px, 1px, 1px);\" title=\"&#8220;Set Up Open WebUI with Ollama on Mac: Your Guide to Offline AI Mastery&#8221; &#8212; Zahirs Blog\" src=\"https:\/\/zahiralam.com\/blog\/set-up-open-webui-with-ollama-on-mac-your-guide-to-offline-ai-mastery\/embed\/#?secret=GesXS1fLOY#?secret=57AqiykgVo\" data-secret=\"57AqiykgVo\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<p>This guide will walk you through the process of setting up Open WebUI, so you can enjoy a smooth and efficient offline AI experience.\n","protected":false},"excerpt":{"rendered":"<p>Ollama is a fantastic tool that allows you to run powerful large language models (LLMs) like Llama 3.1, Llama 3.2, Gemma 2, Code Llama, and many 
[&#8230;]<\/p>\n","protected":false},"author":1,"featured_media":732,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[24],"tags":[17,21],"class_list":["post-136","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","tag-mac-m1-m2-m3","tag-ollama"],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/posts\/136","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/comments?post=136"}],"version-history":[{"count":29,"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/posts\/136\/revisions"}],"predecessor-version":[{"id":1388,"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/posts\/136\/revisions\/1388"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/media\/732"}],"wp:attachment":[{"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/media?parent=136"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/categories?post=136"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/zahiralam.com\/blog\/wp-json\/wp\/v2\/tags?post=136"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}