Yarle: Converts Evernote notes to Markdown (Free app)

Yarle is the ultimate converter of Evernote notes to Markdown.

Features

  • πŸ“ Any text
  • πŸ“ All metadata: original creation time, last modification time, tags, GPS location, notebook name, source URL
  • πŸ”— External links
  • πŸ”— Internal links among Evernote notes
  • πŸ’» Codeblocks
  • πŸ–ΌοΈ Inline Images
  • πŸ“Ž Attachments
  • πŸ“„ Webclips

Customization

  • πŸš€ Creates Markdown files that match user-defined templates. See How to use templates with YARLE for details.
  • πŸ’‘ Metadata support: Puts the title, creation time, update time, tags, lat/long location, source URL, notebook name, and a link to the original HTML into the Markdown file as metadata. (To include these, set up a custom template.)
  • πŸ”¨ Sets the Markdown files' creation, access, and modification timestamps to match the notes' original creation and update times.
  • πŸ”¨ Organizes all attachments into a _resources subfolder (to keep the notes' folder as simple as possible).
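To give a feel for the template feature, here is an illustrative sketch of a custom template. The block/placeholder names below follow Yarle's documented style, but are written from memory and should be checked against the project's template documentation for your version:

```markdown
{title-block}# {title}{end-title-block}

{tags-block}Tags: {tags}{end-tags-block}
{created-at-block}Created: {created-at}{end-created-at-block}
{updated-at-block}Updated: {updated-at}{end-updated-at-block}
{source-url-block}Source: {source-url}{end-source-url-block}

{content-block}{content}{end-content-block}
```

Each `*-block` pair is only emitted when the note actually carries that piece of metadata, so notes without tags or a source URL stay clean.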

Works with

  • πŸ““ a single enex file (one notebook exported from Evernote)
  • πŸ“š a folder of enex files (several notebooks exported and placed into the same local folder)
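The converter is typically run from the command line against a JSON config that points at the enex source and output folder. The command and key names below are a hedged sketch based on Yarle's README (e.g. `npx -p yarle-evernote-to-md yarle --configFile ./config.json`) and may differ between versions:

```json
{
  "enexSources": ["./notebooks/MyNotebook.enex"],
  "outputDir": "./out",
  "templateFile": "./custom-template.tmpl"
}
```

Pointing `enexSources` at a folder instead of a single file covers the multi-notebook case above.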

Imports your notes to

  • Obsidian
  • LogSeq
  • Tana

Platforms

macOS, Windows, and Linux (Debian, Ubuntu, Linux Mint)

License

MIT license

Tags

tool,tools,markdown,evernote,notes,note,bookmarks,system,systems,pdf

Resources

GitHub
