
Update: edited the question title/body based on the suggestion in the comments.

Firestore makes everything that is publicly readable also accessible to any script running in the browser, so nothing stops a user from simply calling db.get('collection') and saving all the data for themselves.

In a more traditional setup, where an app's frontend pulls data from a backend, a user would at least have to go to the extra trouble of tweaking the UI and then scraping the frontend to pull more and more data (think of Twitter's "load more" button).

My question is whether it is possible to stop users from downloading the entire database in one click, while still keeping the data publicly available.
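To illustrate, here is a rough sketch of what I mean, using the plain Firebase web SDK (the config and the 'posts' collection name are just placeholders):

// Rough sketch: assumes the standard Firebase JS SDK <script> tags are on the page.
// The config is the app's public web config, visible in the page source; 'posts' is a placeholder.
var firebaseConfig = { /* apiKey, authDomain, projectId, ... */ };
firebase.initializeApp(firebaseConfig);
var db = firebase.firestore();

// One unfiltered query returns every document the rules allow this user to read.
db.collection('posts').get().then(function (snapshot) {
  var everything = snapshot.docs.map(function (doc) {
    return Object.assign({ id: doc.id }, doc.data());
  });
  console.log('Downloaded ' + everything.length + ' documents');
});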

Old:

From what I understand, any user who can see data coming out of a Firebase datastore can also run a query to extract all of that data. That is not desirable when the data itself has value, and yet Firebase is such an easy-to-use tool that it's great for pretty much everything else.

Is there a way, or a best practice, to structure the data or the access rules such that users can see the data but can't just run a script to download all of it?

Thanks!

Kirill
  • I think you're misunderstanding how security rules work. You can configure things so that only authenticated users can read what you want them to read, which could be different between users, and different for unauthenticated users. Please read up on security rules to get a better sense of how things work before going any further. https://firebase.google.com/docs/firestore/security/get-started – Doug Stevenson Mar 21 '19 at 05:47
  • I have used Firebase as a database for private apps, where data is shared user-to-user, and it has worked great. Now I am trying to figure out if there's a way to put brakes in place so that if users create public data, other users can see it but can't download all of it via a simple db.get('collection') script. Do security rules support limits (say, allow 50 records per user per session)? – Kirill Mar 21 '19 at 05:55
  • Would you like to edit your question to state what you're actually asking here? – Doug Stevenson Mar 21 '19 at 07:20

1 Answer


Kato once implemented a simplistic rate limit for writes in Realtime Database security rules: Firebase rate limiting in security rules?. Something similar could be possible in Cloud Firestore rules. But this approach won't work for reads, since you can't update the timestamp at the same time the read is performed.
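The core of that trick is a per-user timestamp that every write must advance to the current server time, and that may only advance every few seconds. A minimal sketch of just that stamp in Realtime Database rules (the /last_post path and the 5-second window are placeholders; the linked answer additionally ties the actual message write to the stamp):

{
  "rules": {
    "last_post": {
      "$uid": {
        ".write": "auth != null && auth.uid == $uid",
        ".validate": "newData.isNumber() && newData.val() == now && (!data.exists() || newData.val() > data.val() + 5000)"
      }
    }
  }
}

Since there is no way to require a read to bump such a stamp, the same trick doesn't translate to rate-limiting reads.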

You can, however, limit what queries a user can perform on your database. For example, to limit them to reading at most 50 documents at a time:

allow list: if request.query.limit <= 50;
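For context, such a rule sits inside a normal ruleset; a sketch, with /posts standing in for whatever publicly readable collection you have:

rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /posts/{postId} {
      // Anyone may fetch a single document.
      allow get: if true;
      // Queries must explicitly ask for 50 documents or fewer.
      allow list: if request.query.limit <= 50;
    }
  }
}

With rules like these, a query that sets no limit, or a limit above 50, is rejected outright rather than truncated to 50.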
Frank van Puffelen
  • So then the attacker just makes multiple queries with limit 50 for as long as it takes to get the whole collection? – Doug Stevenson Mar 21 '19 at 19:08
  • Yup. Feel free to write an alternative answer, that's probably dependent on auth. – Frank van Puffelen Mar 21 '19 at 19:34
  • Right, I also just realized that perhaps the easiest solution to what I was looking for is to let users write via the frontend as they normally do, and then perform reads from the server, where I can cap the number of queries. It's more work, but this way I can still populate Firebase with user data and not worry about anyone downloading the whole set. – Kirill Mar 21 '19 at 23:10
  • If your backend is exposed via an HTTP endpoint, that also can be abused. Any point of exposure might be a liability. – Doug Stevenson Mar 22 '19 at 00:01
  • If you want to impose read rate limits, then a Cloud Function might be a good option. But before doing that, I'd consider simply limiting the data you expose to your users to begin with. As Doug said: anything they can access can be abused, whether it's through Cloud Functions or directly. – Frank van Puffelen Mar 22 '19 at 00:39
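For what it's worth, a rough sketch of the server-side read path discussed in these comments, as a callable Cloud Function with a hard page-size cap. The posts collection, the createdAt field, and the function name are made up for the example:

const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

exports.getPosts = functions.https.onCall(async (data, context) => {
  // Require sign-in so requests can be attributed (and throttled) per user.
  if (!context.auth) {
    throw new functions.https.HttpsError('unauthenticated', 'Sign in first.');
  }

  // Never hand back more than 50 documents per call, whatever was requested.
  const limit = Math.min(Number(data && data.limit) || 20, 50);

  const snapshot = await admin.firestore()
    .collection('posts')
    .orderBy('createdAt', 'desc')
    .limit(limit)
    .get();

  return snapshot.docs.map((doc) => ({ id: doc.id, ...doc.data() }));
});

Note that the cap only limits the page size; a caller can still invoke the function repeatedly, so per-user throttling (or simply exposing less data) is still needed, as the comments above point out.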