Posts tagged with “javascript”

A Firefox extension to boost my job hunt.

This post is a bit different from what I usually share.

My day-to-day work revolves around ecological data and AI, but I wanted to take a break and brush up on my JavaScript and web-dev skills. So I made a little browser extension to help with job hunting.

Job searching can get messy fast — tons of tabs, job links everywhere, and trying to match your CV to each offer. It’s easy to lose track.

That’s why I built JobSeeker Companion: a simple Firefox extension that keeps your job search tidy and helps you spot which jobs fit your CV. No tracking, no fuss — just a handy tool to make things easier.

📋 1. Clipboard Helper

Easily copy job offer URLs as you browse. No need to dig through your history later—your links are all stored in one place.

💾 2. Save Offers

Like what you see? Save job descriptions directly from the browser and revisit them anytime.

🧠 3. Match Your CV

Paste your CV into the extension once, then check how well it matches any job offer you’re viewing. The tool highlights missing keywords to help you fine-tune your applications.

🔒 Privacy First

One important aspect I want to highlight: your data stays private. This extension does not store any of your CV or job description data anywhere. All processing happens locally on your machine, with no calls to external AI services or servers. Your information never leaves your browser, so you can use the tool with confidence and peace of mind.

⚙️ Tech used

Built as a Firefox extension in JavaScript, using Manifest V3 and the WebExtensions API. It injects content scripts to scrape job descriptions and performs simple token-based matching against your CV. The UI uses chrome.i18n for localization.
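The token-based matching can be sketched roughly like this. This is an illustrative version under my own assumptions (the tokenizer, stop-word list, and function names are not the extension's actual code):

```javascript
// Naive token-based matching: lowercase both texts, split into word sets,
// then report the offer's keywords that are absent from the CV.
const STOP_WORDS = new Set(['the', 'and', 'a', 'of', 'to', 'in', 'for', 'with']);

function tokenize(text) {
  return new Set(
    text
      .toLowerCase()
      .split(/[^a-z0-9+#]+/) // '+' and '#' kept so tokens like "c++" survive
      .filter((t) => t.length > 2 && !STOP_WORDS.has(t))
  );
}

// Returns a coverage score in [0, 1] and the offer tokens missing from the CV.
function matchCV(cvText, offerText) {
  const cvTokens = tokenize(cvText);
  const offerTokens = tokenize(offerText);
  const missing = [...offerTokens].filter((t) => !cvTokens.has(t));
  const score = offerTokens.size
    ? (offerTokens.size - missing.length) / offerTokens.size
    : 0;
  return { score, missing };
}
```

A real matcher would also weight rare terms more heavily than common ones, but even this naive overlap is enough to surface missing keywords.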

🧪 Try It Now

The extension is available on Firefox Add-ons and on my GitHub.

🚀 What’s Next?

Future updates might include semantic similarity analysis using embeddings to improve matching accuracy and support for more job platforms — currently, the extension scrapes entire pages except on LinkedIn, where it targets only the job description panel.

If you have ideas or want to help develop these features, feel free to reach out or contribute on GitHub!

🙌 Support the Project

If this tool helps you during your job search, you can buy me a coffee ☕💖.


Creating a Fish-Focused Chatbot with OpenAI and the GBIF API: A Step-by-Step Guide

In a previous post, I detailed how to build a fish information-fetching web application with Flask, which you can check out here. Now, I’m excited to take it a step further and explore how to create a specialized chatbot focusing on marine science, particularly fish species and their distribution. 🐠

Chatbots have become an integral part of user interaction, providing instant answers and improving the user experience. This project leverages OpenAI's language model and the Global Biodiversity Information Facility (GBIF) API to provide accurate, detailed responses.

Project Overview

The objective of the chatbot is to respond to inquiries about marine species, habitats, and behaviors, while also suggesting follow-up questions and retrieving taxonomic data. The bot is designed to handle specific questions, guide users toward deeper knowledge, and ensure that all interactions maintain a structured format. This has been the most challenging aspect, as LLMs often struggle to retain prior knowledge.

Step 1: Setting Up the Environment

Before diving into coding, ensure you have the necessary tools installed:

  • Node.js: This will help run our server-side code.
  • OpenAI API Key: You need an account with OpenAI to access their models. Secure your API key from the OpenAI website.
  • GBIF API: This API will allow us to retrieve taxonomic data on fish species.

Step 2: Building the Chatbot

Here’s how I structure the chatbot using JavaScript:

Importing Dependencies

We will use the openai package to access the OpenAI API, and the browser's built-in fetch to call the GBIF API.

// env.js exposes the API key through a process-like object
import { process } from '/env.js';
import { Configuration, OpenAIApi } from 'openai';

Configuration

Set up the OpenAI configuration with the API key.

const configuration = new Configuration({
    apiKey: process.env.OPENAI_API_KEY
});

const openai = new OpenAIApi(configuration);

Defining the Expected Response Format

We create a structure that the bot will follow for its responses:

const expectedFormat = {
    response: '',
    keywords: [],
    follow_up_questions: [],
    taxon: 'none'
};
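For instance, a reply that follows this structure might look like the following (illustrative content I made up for this post, not actual model output):

```json
{
    "response": "Clownfish live in symbiosis with sea anemones on coral reefs...",
    "keywords": ["clownfish", "anemone", "symbiosis"],
    "follow_up_questions": ["Where are clownfish found in the wild?"],
    "taxon": "Amphiprion"
}
```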

Conversation Array

The conversation history is stored in an array, which includes a system message that outlines the bot's purpose and constraints.

const conversationArr = [
    {
        role: 'system',
        content: `You are an expert in marine science...`
    }
];

This is where I constrain the bot's behavior, which admittedly makes it not much fun at parties.

Step 3: User Input Handling

To interact with users, we create an event listener that captures input and processes it:

document.addEventListener('submit', (e) => {
    e.preventDefault();
    const userInput = document.getElementById('user-input');   
    // Create speech bubble and add user input to conversation
    ...
    fetchReply(); // Call the function to get the bot's reply
});

Step 4: Fetching Replies from OpenAI

The fetchReply function sends the user's input to the OpenAI API, processes the response, and updates the conversation history:

async function fetchReply() {
    const response = await openai.createChatCompletion({
        model: 'gpt-4',
        messages: conversationArr,
    });
    // Process and handle the response
    ...
}

Step 5: Taxonomic Data Retrieval

To enrich responses with relevant taxonomic information, we call the GBIF API based on the taxon extracted from the chatbot's reply:

async function getTaxonKey(taxon) {
    const url = `https://api.gbif.org/v1/species/match?name=${encodeURIComponent(taxon)}`;
    ...
}

If the taxon is found, a distribution map is plotted on the left using the corresponding GBIF map API call. I opted for one of the many styles available from the GBIF API, but the others are really cool too!
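The two GBIF calls can be sketched as follows. The tile URL and style name follow GBIF's v2 maps API as I understand it; buildMapTileUrl is my own illustrative helper, and error handling is omitted:

```javascript
// Resolve a taxon name to a GBIF usageKey via the species match endpoint.
async function getTaxonKey(taxon) {
  const url = `https://api.gbif.org/v1/species/match?name=${encodeURIComponent(taxon)}`;
  const res = await fetch(url);
  const data = await res.json();
  // matchType is 'NONE' when GBIF cannot resolve the name
  return data.matchType !== 'NONE' ? data.usageKey : null;
}

// Build an occurrence-density tile URL for that key (one of GBIF's styles).
function buildMapTileUrl(taxonKey, z = 0, x = 0, y = 0) {
  return `https://api.gbif.org/v2/map/occurrence/density/${z}/${x}/${y}@1x.png` +
         `?taxonKey=${taxonKey}&style=classic.poly`;
}
```

The resulting PNG tile can be dropped straight into an img element, or used as a tile layer in a mapping library.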

Step 6: Rendering Responses and Suggestions

Once the bot has generated a reply, we render it on the UI, implementing a typewriter effect for enhanced user engagement. We also provide suggestion buttons for follow-up questions:

function renderTypewriterText(text) {
    // Create a typewriter effect for the bot's reply
    ...
}

function renderSuggestions(keywords, follow_up_questions) {
    // Create suggestion buttons for follow-up questions
    ...
}

Step 7: Testing and Iteration

After implementing the chatbot, thoroughly test it with various inputs to ensure it functions as intended. Pay attention to how it handles edge cases, such as invalid queries or API errors.

Next Steps

Looking ahead, I’d like to connect the bot to the FishBase API, which would provide more quantitative data about fish species. This would make the interaction even more informative. I also think it would be great to include scientific references alongside the bot’s responses, so users can dig deeper into the research if they want to learn more.

The whole project is available on my GitHub: Chatbot Whaly.