When should you use SQL instead of MongoDB?

Jason Voorhees
I have been building a few backends with Node.js recently, and always used MongoDB. I have used MySQL before, but mostly because I had no idea what I was doing and that was the only DB I had ever heard of.

I know that SQL databases are still very widely used, so there must be something I am missing, but I feel like MongoDB is just always the better choice for JS-based programs, since the JSON objects are way easier to work with.

Are SQL and relational databases that much better outside of big data applications? Should I try to implement them more? Querying the database is much easier with MongoDB tbh

@User28823 @gooner23
 
  • +1
  • JFL
Reactions: ReadBooksEveryday, HumidVent and TechnoBoss
@hello12344
 
Seems like no one on this forum is interested in backend development
 
  • JFL
  • So Sad
  • +1
Reactions: efidescontinuado, normie_joe, horizontallytall and 2 others
Seems like no one on this forum is interested in backend development
Yes, indeed. Still thinking about dropping stemcuckery but based military awaits which is a worse scenario for me.
 
  • +1
Reactions: Jason Voorhees
@WeiWei
Yes, indeed. Still thinking about dropping stemcuckery but based military awaits which is a worse scenario for me.
I am personally doing it not just for the financial prospects but also because I like problem solving. I am challenging myself mentally, although STEM can get very stressful and toxic sometimes, yes
 
  • +1
Reactions: St.TikTokcel
@WeiWei

I am personally doing it not just for the financial prospects but also because I like problem solving. I am challenging myself mentally, although STEM can get very stressful and toxic sometimes, yes
I just had nowhere to go. Muh DML/DCL/DDL, muh foreign key :soy:
 
  • +1
Reactions: Jason Voorhees
Very good question for a forum that's interested in fucking women 24/7.
 
  • JFL
Reactions: dna_cel, Megas Alexandros, laaltin and 5 others
Whether to use SQL or MongoDB (or any other NoSQL database) depends on your specific requirements:

  • If you need strong consistency, complex transaction management, and complex queries, and your data is highly structured, a relational database might be the better choice (see the transaction sketch after this list).
  • If you're developing in a JavaScript-heavy ecosystem, need to handle large volumes of unstructured or semi-structured data with flexible schemas, or require easy scalability, MongoDB could be advantageous.
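To make the first point concrete, here's roughly what an atomic transfer between two accounts looks like against a relational DB. Just a sketch, assuming Postgres and the pg driver; the connection string and the accounts(id, balance) table are made up:
Code:
const { Client } = require('pg');

// Move money between two accounts atomically: either both UPDATEs
// are applied, or neither is (hypothetical accounts(id, balance) table).
async function transfer(fromId, toId, amount) {
  const client = new Client({ connectionString: 'postgres://localhost/demo' });
  await client.connect();
  try {
    await client.query('BEGIN');
    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, fromId]
    );
    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, toId]
    );
    await client.query('COMMIT');
  } catch (err) {
    await client.query('ROLLBACK'); // undo both updates on any failure
    throw err;
  } finally {
    await client.end();
  }
}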
 
  • +1
  • JFL
  • Love it
Reactions: efidescontinuado, Pakicel, normie_joe and 2 others
But to answer your question, yes, Mongo stores data as JSON documents whilst muh SQL has the classical variant of tables (rows and columns)
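Rough sketch of the same record going in both ways, assuming the official mongodb driver and mysql2 for the SQL side; the DB, collection, table, and field names are all made up:
Code:
const { MongoClient } = require('mongodb');
const mysql = require('mysql2/promise');

async function demo() {
  // MongoDB: the object goes in as-is, nested bits and all.
  const mongo = await MongoClient.connect('mongodb://localhost:27017');
  await mongo.db('shop').collection('users').insertOne({
    name: 'Jason',
    tags: ['backend', 'node'],          // arrays are fine
    address: { city: 'Crystal Lake' },  // nested objects too
  });

  // SQL: the same data has to fit predeclared columns; nested or
  // repeating parts usually end up in separate, joined tables.
  const db = await mysql.createConnection({
    host: 'localhost', user: 'root', database: 'shop', // made-up credentials
  });
  await db.execute(
    'INSERT INTO users (name, city) VALUES (?, ?)',
    ['Jason', 'Crystal Lake']
  );

  await mongo.close();
  await db.end();
}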
 
  • +1
Reactions: Jason Voorhees
I have been building a few backends with Node.js recently, and always used MongoDB. I have used MySQL before, but mostly because I had no idea what I was doing and that was the only DB I had ever heard of.

I know that SQL databases are still very widely used, so there must be something I am missing, but I feel like MongoDB is just always the better choice for JS-based programs, since the JSON objects are way easier to work with.

Are SQL and relational databases that much better outside of big data applications? Should I try to implement them more? Querying the database is much easier with MongoDB tbh

@User28823 @gooner23
It's crazy how much SQL is still used today. I walk around the floor when I do my rounds and see 30% of people writing SQL in SAS EG...
 
  • +1
Reactions: Jason Voorhees and TechnoBoss
I'm a beginner and was just encountering this issue. Apparently I'm using Node.js to upload all of my JSON files without needing to use a database like SQL. When I tried using SQL, it was taking too long to build the database, so it seemed inefficient.
 
  • +1
Reactions: Pakicel and Jason Voorhees
copypasta
 
  • +1
Reactions: Jason Voorhees
@WeiWei

I am personally doing it not just for the financial prospects but also because I like problem solving. I am challenging myself mentally, although STEM can get very stressful and toxic sometimes, yes
What's challenging in tech lmao, just shifting bits around.. go to cutting-edge physics or HF trading if you want to problem solve and shizz
 
  • +1
Reactions: Jason Voorhees
I mostly use SQL if I'm working w/ databases, but Mongo's not bad
 
  • +1
Reactions: Jason Voorhees
MongoDB is for larger data like forms and unstructured data; use SQL for anything else
 
  • +1
Reactions: Jason Voorhees
Mongo is such a hard language
SQL is something that a layman can learn.
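For reference, here's the same lookup written both ways so you can judge for yourself; just a sketch with a made-up users table/collection:
Code:
// SQL version (for comparison):
//   SELECT name, email FROM users WHERE age > 30 ORDER BY name;

// MongoDB version with the Node.js driver:
const { MongoClient } = require('mongodb');

async function findAdults() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const users = client.db('app').collection('users');
  const result = await users
    .find({ age: { $gt: 30 } }, { projection: { name: 1, email: 1 } })
    .sort({ name: 1 })
    .toArray();
  await client.close();
  return result;
}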
 
  • +1
Reactions: Jason Voorhees
Mongo is such a hard language
SQL is something that a layman can learn.
Anyone can execute SQL commands, but they don't accomplish anything, even with natural joins, self joins, and subqueries. They only make for boring class lectures in a CS course. PL/SQL is what actually gets used when querying databases in companies. Read the syntax for that and come back.
 
Last edited:
  • +1
Reactions: Pakicel
Anyone can execute SQL commands, but they don't accomplish anything, even with natural joins, self joins, and subqueries. They only make for boring class lectures in a CS course. PL/SQL is what actually gets used when querying databases in companies. Read the syntax for that and come back.
MongoDB is for larger data like forms and unstructured data; use SQL for anything else
 
  • +1
Reactions: gooner23
As if u tard are smart enough for programming
 
As if u tard are smart enough for programming
You have been to the wards over 20 times in the last 3 years. I wouldn't talk shit if I were you
 
  • JFL
Reactions: Magnum Opus
When you want to store semi-structured or unstructured data
 
  • +1
Reactions: Pakicel
I'm a beginner and was just encountering this issue. Apparently I'm using Node.js to upload all of my JSON files without needing to use a database like SQL. When I tried using SQL, it was taking too long to build the database, so it seemed inefficient.
What are you using to parse the JSON files? The built-in functions provided by the RDBMS? IDK if you are using Oracle, but maybe try inserting the JSON objects as a CLOB datatype and then parsing them?

If that doesn't work, try loading those JSON objects into pandas (Python) to see if it speeds things up and then insert those dataframes into the DB.
 
Last edited:
It's better to use SQL in the majority of cases. With NoSQL DBs like MongoDB, it's harder to enforce a schema and joining different collections is much slower. The only real benefit is that a NoSQL DB is easier to scale horizontally, as collections can be split across multiple shards, but I doubt you are dealing with a large enough volume of data for this to be a concern for you.
 
Btw you are right that you don't need to parse or clean the JSON object and you can just insert it into the collection raw. But the problem is it's harder to enforce data quality. Like what if the structure of the source JSON object changes? Maybe you had a key-value pair in the JSON like {"ABC":123} but then it is changed to {"ABC-v2":123} or something like that, and what if you want to do some kind of aggregation downstream? For a SQL DB, you would have to either specify a new column or do some parsing to map it to an existing column, but I guess MongoDB wouldn't flag it at all. Granted, I haven't really used MongoDB much so I am not sure, but I have used another NoSQL DB quite a bit and from what I remember you can't just use an alias to refer to multiple fields, so you would have to be vigilant about schema changes and catch them beforehand, or those badly mapped fields would have to be re-indexed. I mean, you could just account for it when querying but that is prolly not best practice.
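If you do want Mongo to flag that kind of key rename, I think you can create the collection with a $jsonSchema validator so documents missing the expected field get rejected instead of slipping in silently. A rough, untested sketch; the DB, collection, and field names are made up:
Code:
const { MongoClient } = require('mongodb');

async function createValidatedCollection() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const db = client.db('warehouse');

  // Reject any document without a numeric "ABC" field, so an upstream
  // rename to "ABC-v2" fails loudly instead of polluting the collection.
  await db.createCollection('events', {
    validator: {
      $jsonSchema: {
        bsonType: 'object',
        required: ['ABC'],
        properties: { ABC: { bsonType: ['int', 'long', 'double'] } },
      },
    },
    validationAction: 'error', // 'warn' would only log the violation
  });

  await client.close();
}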
 
Last edited:
Anyone can execute SQL commands, but they don't accomplish anything, even with natural joins, self joins, and subqueries. They only make for boring class lectures in a CS course. PL/SQL is what actually gets used when querying databases in companies. Read the syntax for that and come back.
Yup. If you are using Oracle at your company, there is going to be a fuck ton of PL/SQL. I am not a fan of it tbh because it's difficult to version control and it lacks the flexibility I would want for what I use it for. Like, the PL/SQL procedures are coupled with the DB, so you can't just package them into a container to test and deploy as you wish. And I can't just, say, install some library to do whatever I want like in Python. That being said, Oracle is being phased out as many companies are moving to cloud-native warehouses like Snowflake, which use dbt where PL/SQL would have been used. Oracle is likely still gonna be around in more old-school companies for a while though.
 
It's crazy how much SQL is still used today. I walk around the floor when I do my rounds and see 30% of people writing SQL in SAS EG...
Probably a bank or some other old-school financial services firm. SAS is legacy AF from what I have heard.
 
What are you using to parse the JSON files? The built-in functions provided by the RDBMS? IDK if you are using Oracle, but maybe try inserting the JSON objects as a CLOB datatype and then parsing them?

If that doesn't work, try loading those JSON objects into pandas (Python) to see if it speeds things up and then insert those dataframes into the DB.
JavaScript using Node.js

I think I'm just using Node.js to parse through a lot of JSON files and having them uploaded to a server.

IIRC, I was using Python to create a database and then tried using Java to see if that sped up the process, but gave up. I'm building an actual table (similar to an Excel table) with a lot of JSON files (potentially millions), so I'm not sure if what I'm doing is most efficient.
 
  • +1
Reactions: Pakicel
JavaScript using Node.js
Haven't used JavaScript much, but there must be some kind of library that is an API to another language written specifically for data processing? I have never heard of JS being used for data-intensive tasks. Maybe it isn't the right tool for what you want to do?
I think I'm just using Node.js to parse through a lot of JSON files and having them uploaded to a server.
How big is each file?
IIRC, I was using Python to create a database and then tried using Java to see if that sped up the process, but gave up. I'm building an actual table (similar to an Excel table) with a lot of JSON files (potentially millions), so I'm not sure if what I'm doing is most efficient.
What do you mean by create a database? Do you mean creating a client using a Python library, connecting to the DB, and running insert queries? Java can be quicker than Python, but the bottleneck here seems to be how quickly you are able to process each JSON object. If you use the right Python libraries, doing it in Python can be faster.

If you have a lot of JSON files under 10 GB and the JSON objects aren't too nested, then just load each of them into a pandas dataframe. Then you can insert them into the SQL DB. What do you mean by an actual table? A CSV file? Or do you mean a spreadsheet? Both are terrible ideas. Flat files and spreadsheet applications are not meant to hold huge amounts of data. Like Excel can't handle more than a million rows of data.
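Or, if you'd rather stay in Node since that's what you already have, the main thing with that many small files is to batch the inserts instead of doing one write per file. A rough sketch, assuming a local MongoDB and a made-up ./data folder; the same batching loop works with a SQL client if you go the relational route:
Code:
const fs = require('fs');
const path = require('path');
const { MongoClient } = require('mongodb');

async function loadAll(dir) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const records = client.db('app').collection('records');

  const files = fs.readdirSync(dir).filter((f) => f.endsWith('.json'));
  const BATCH = 1000; // one insertMany per 1,000 files instead of per file

  let batch = [];
  for (const file of files) {
    batch.push(JSON.parse(fs.readFileSync(path.join(dir, file), 'utf8')));
    if (batch.length === BATCH) {
      await records.insertMany(batch, { ordered: false });
      batch = [];
    }
  }
  if (batch.length) await records.insertMany(batch, { ordered: false });

  await client.close();
}

loadAll('./data').catch(console.error);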
 
Haven't used JavaScript much, but there must be some kind of library that is an API to another language written specifically for data processing? I have never heard of JS being used for data-intensive tasks. Maybe it isn't the right tool for what you want to do?
These are what I'm using for my server.js file. I run it using Node.js:
Code:
const express = require('express'); // web framework for the HTTP server
const fs = require('fs');           // read the JSON files from disk
const path = require('path');       // build filesystem paths
const math = require('mathjs');     // math utilities from the mathjs package
const app = express();              // create the Express app

How big is each file?
1–200 KB. They could get larger, but not by much. I'm uploading ~500,000 of them, but that will likely increase to one or two million in the future.

What do you mean by create a database? Do you mean creating a client using a Python library, connecting to the DB, and running insert queries? Java can be quicker than Python, but the bottleneck here seems to be how quickly you are able to process each JSON object. If you use the right Python libraries, doing it in Python can be faster.
I was creating the database locally (with local JSON files), and then presumably uploading that database to the server.

If you have a lot of JSON files under 10 GB and the JSON objects aren't too nested, then just load each of them into a pandas dataframe. Then you can insert them into the SQL DB.
They are barely nested. I may go back to trying to do this.

What do you mean by an actual table? A CSV file? Or do you mean a spreadsheet? Both are terrible ideas. Flat files and spreadsheet applications are not meant to hold huge amounts of data. Like Excel can't handle more than a million rows of data.
I'm designing a table for a webpage that has increased functionality compared to Excel.
 
  • +1
Reactions: Pakicel
