Cosmicomical

@Cosmicomical@kbin.social


Cosmicomical,

That's the problem, then. You shouldn't store HTML entities in the db; the table is most likely already UTF-8, which supports all characters.

Cosmicomical,

Sorry for the late reply, but the point is that there is no trivial way to detect whether, and how many times, something has been encoded. You can end up with multiple levels of encoding across multiple systems, and everything becomes intractable. Moreover, as I said, this doesn't have to be a problem: just decode everything as much as you can BEFORE you put it in the db, since the db can handle the raw text by itself. Just let it do its job. Paradoxically, if you only use channels that support UTF-8 and don't apply any transformation, your data is already perfect as it is. It is then the client's job to do whatever it needs to render properly; for instance, a non-HTML client shouldn't need an HTML library just to strip HTML markup from the text before it can be displayed.
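
A minimal sketch of the "decode as much as you can before storing" idea, in Python (the function name fully_decode and the sample string are illustrative, not from the original post):

```python
import html

def fully_decode(text: str) -> str:
    """Repeatedly decode HTML entities until the text stops changing,
    so accidental double (or triple) encoding collapses to plain text."""
    decoded = html.unescape(text)
    while decoded != text:
        text = decoded
        decoded = html.unescape(text)
    return decoded

# A string that was entity-encoded twice somewhere along the pipeline:
print(fully_decode("&amp;eacute;"))  # prints: é
```

Once the text is plain UTF-8 like this, it can go straight into a utf8 column and each client can render it however it needs.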

Cosmicomical,

Sorry for the late reply, it's been a week... but yeah, passing creds in a GET request is very bad for multiple reasons. For instance, if you pass the creds on a page that contains ads or trackers, they will probably store the URL AND your credentials and propagate them to a million third-party systems. That's. Not. Good.
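
As a rough illustration of the difference (the URL and field names here are made up), using Python's requests library:

```python
import requests

# Bad: credentials land in the query string, so they show up in server logs,
# browser history, Referer headers, and any tracker that sees the URL.
requests.get("https://example.com/login",
             params={"user": "alice", "password": "hunter2"})

# Better: send credentials in the request body (or an Authorization header),
# which is not part of the URL that gets logged and propagated.
requests.post("https://example.com/login",
              data={"user": "alice", "password": "hunter2"})
```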

Cosmicomical,

To be honest, it's already incredible that the platform works at all and has all these features. Great job, really! I'm not being sarcastic: it needs improvement, but it's a great achievement.
