
Surely it's easier to stick to JavaScript The Good Parts than to introduce a whole new layer in the form of Blazor?


Blazor is not equal to JavaScript. Blazor is Microsoft's answer to React and Flutter. So far, of the three, I like Blazor, but not the server version that uses SignalR to communicate with the client, because it locks you in. Blazor WebAssembly is awesome; it just needs a good DOM-manipulation library. With Blazor Wasm on the client, I can use anything on the back end.


The question is very unrealistic... how often do you know there is exactly one duplicate in a list?

Much more common is a list with zero or more duplicates.

Here is what I came up with:

- We need a way to record what has been already seen

- A hash-map could work, but I think we can be more efficient

- We know that all elements are less than the array length...

- So we can allocate a single array of flags with the same length as the input

- The flag for value `n` is at position `n` in this array

    let findDuplicates xs =
      // One flag per possible value; assumes 0 <= x < length for every x.
      let seen = Array.create (Seq.length xs) false

      let mutable duplicates = []

      for x in xs do
        if seen[x] then
          // Seen before, so record it as a duplicate.
          duplicates <- x :: duplicates
        else
          seen[x] <- true

      duplicates

I'm pretty sure this is O(n)?

I think it's interesting that the mathematical trick (the sum-of-numbers-1-to-n formula) does not work in the more realistic variant. This fact is probably why LeetCode problems are so disconnected from the real world. It's like AI for board games.


The question isn't really about finding duplicates in a list, but a slightly backhanded way to ask "do you remember high school math and can you apply it". A less artificial problem would just make things more complicated.

Also, your way is O(n) memory when O(1) would be enough for the original question.
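
Something like this F# sketch (the function name is mine; it assumes the list holds the values 1..n-1 with exactly one of them repeated):

    // Sum of 1 .. n-1 is n*(n-1)/2, so the single duplicate is whatever
    // the actual sum exceeds that by. O(n) time, O(1) extra memory.
    // (Switch to int64 if the sums could overflow.)
    let findSingleDuplicate (xs: int[]) =
        let n = xs.Length
        Array.sum xs - (n * (n - 1) / 2)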


Always start a programming interview by asking questions. You know there is exactly one duplicate in the list because you wouldn’t start coding without thinking about what the problem left unspecified and getting the interviewer to clarify.


This is true cyberpunk.


No, because the metadata on the deleted resource is now lost.

With the information stored outside the resource, we know both that it was deleted and what its metadata was.


That's not state information though, that's metadata.


If some state information is stored as metadata, then we need that metadata to know the total state. The contents of TF state are often more than what can be read from the provider APIs.


Yeah, but the extra stuff they add is a trade-off, and lots of people would rather not have two sources of truth in exchange for it.


Most of the people I know who use Terraform use Terraform as the source of truth.


You can wish things were a certain way, but that doesn't change reality, no matter how nice that would be.


Seen this many times. People using a tool in an unintended manner and then bashing the tool.


Huh, so you access your infra through Terraform?


This conversation is embarrassing for both of you.


Tell me more please. There’s always something new to learn.


It's super embarrassing if you think TF is the source of truth when the cloud provider clearly is.


My infrastructure drifts from my source of truth.


Through an API, really.


This strategy would miss some huge and unexpected gains. Tesla comes to mind (at least for the time being...)


On the other hand, it also avoids Tesla when it inevitably crashes back to the same ballpark as other automotive companies... No, I really believe it is nothing special and will eventually come down.


It's not that GraphQL enables anything that was not possible before; it's that GraphQL provides a bit more structure and standardization around these things. If you plan to do this stuff, why not follow something with a spec and various bits of tooling rather than doing it ad-hoc?


The problem is Bitcoin Core may have (will have) unknown vulnerabilities and those might not get patched correctly once discovered.


Yes, this is true. I'm mostly worried about the elliptic curve signature part, as everything else could be fixed with an emergency hard fork (except SHA256).

Sadly the OP_CAT operation (or a substring-equality operation) is disabled, which would make Lamport signatures available again for high-value transactions. I would love it if Lamport signatures were enabled again (it would be quite easy to do), but I'm afraid there isn't enough consensus to do it at this point, because some people would think it's wasteful, and also Lamport signatures are dangerous, as they can be used only once.
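
For the curious, here's a rough sketch of why Lamport signatures are one-shot. This is plain .NET hashing (requires .NET 6+), not Bitcoin script, and all the names are mine:

    open System.Security.Cryptography

    let sha256 (bytes: byte[]) = SHA256.HashData bytes

    // Private key: 256 pairs of random 32-byte preimages.
    // Public key: the SHA-256 hash of each preimage.
    let generateKeys () =
        let priv =
            Array.init 256 (fun _ ->
                Array.init 2 (fun _ -> RandomNumberGenerator.GetBytes 32))
        priv, priv |> Array.map (Array.map sha256)

    // Signing reveals one preimage per bit of the message digest.
    // This is why a key must never sign twice: every signature
    // leaks half of the private key.
    let sign (priv: byte[][][]) (message: byte[]) =
        let digest = sha256 message
        Array.init 256 (fun i ->
            let bit = (digest[i / 8] >>> (7 - i % 8)) &&& 1uy
            priv[i][int bit])

    // Verification re-hashes the revealed preimages and checks them
    // against the matching halves of the public key.
    let verify (pub: byte[][][]) (message: byte[]) (signature: byte[][]) =
        let digest = sha256 message
        signature
        |> Array.mapi (fun i preimage ->
            let bit = (digest[i / 8] >>> (7 - i % 8)) &&& 1uy
            sha256 preimage = pub[i][int bit])
        |> Array.forall id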


That can be desirable but there are a few challenges:

- The compiler code becomes more complicated, making correctness harder

- The compiler might become slower to run

- Introducing new language features may become harder, again due to code complexity


Compiler will get plenty complicated without IDE scenarios, trust me on that one. Slowness is also never really a thing to worry about here, especially because usage patterns in an IDE vs. a batch process are so different. It's almost always the other way around: someone writes something that's completely fine for a batch process but tanks IDE performance.


> The compiler code becomes more complicated, making correctness harder

> Introducing new language features may become harder, again due to code complexity

It'll be written for IDEs anyway. Might as well reuse if possible, right?


Here's an idea for the EU: mandate that all major browsers ship with third-party cookies disabled by default and drop the whole cookie-banner nonsense.


Other idea: make browsers have a proper cookie banner, not one that tricks me into selling my soul. I never got why pages would need individual banners.


Evil sites will use localStorage or some third-party API and continue tracking you.

I am sure people here will find at least 20 solutions to the problem of "how can a group of evil websites track a user across sites if cookies do not work but JS is on". The solution would involve something like: drop these lines in your HTML page, and the JS code there will connect to some server and store a fingerprint there. Google might decide to give your browser a fingerprint to help with their ad business.


I wouldn't approach this problem from a technical direction.

If there is a browser-based, vendor-agnostic opt-in popup for user tracking (not only cookies), you can outlaw and severely punish attempts to circumvent it.

Given the time and resources, courts really dislike the "welllll technically..." argument.


For GDPR, localStorage, fingerprinting, and other methods are all equivalent to cookies. Even IP tracking is the same as a cookie.

"Cookie banner" is just tech jargon for these banners, but an incorrect one.


I know. I tried (and probably failed) to explain to OP why his simple idea to "just make the browsers disable third-party cookies", or other technical solutions, is not going to work; you need GDPR-like laws that focus on the actual problem and not on some technical implementation, because developers will find workarounds for anything technical-only.

Browsers could help by implementing a standard GDPR popup for these shitty websites to share; at least it would not be the same dark-pattern UX and broken implementations these sites use today.

Browsers could do a lot of good things if they focused on actual user needs and not on what some developer feels is cool to work on or what some giant company wants to implement next.


Gotcha. FYI, the post you replied to suggested "make browsers have a proper cookie banner", which seems like you agree with? The one that says "disable third party cookies" was two levels above your post.


>Gotcha. FYI, the post you replied to suggested "make browsers have a proper cookie banner", which seems like you agree with? The one that says "disable third party cookies" was two levels above your post.

Ah, sorry, I messed up. I am trying to force myself to always quote the text I am replying to; sometimes I do not, and it causes issues. I will try to do better.


You can simply make that illegal.


That's the point of web bugs (https://webbug.eu/): no cookies needed. Actually, JS isn't needed either, but it sure is useful.


Please don't force browsers (clients) to fix what's fundamentally a server-side issue (websites doing shit with your data).

Browsers can choose to respect cookies (first- or third-party), but ultimately don't force them to do or not do anything.


Forcing 3 companies to change their browsers is a lot easier than forcing millions of shady US businesses to do anything


It would be nice to have browser support for a cookie popup that is uniform and not worded differently everywhere. Maybe even a default setting and the ability to auto-reject. The popups have ruined the experience.


> Maybe even a default setting and ability to auto-reject.

We sort of have auto-reject, with the Do Not Track header. Which pretty much everyone has decided to ignore, because then people just say no and that's not the result they want.

> The popups have ruined the experience.

And behind every popup is a company that decided that ruining your experience was the correct thing to do.


> And behind every popup is a company that decided that ruining your experience was the correct thing to do.

Yes, of course, they want you to think "uugh privacy just means lots of work and popups I'll just click accept"


The problem is that both the client and the server are controlled in large part by Google, and they like to optimize the user experience into whatever allows them to sell ads.

Lynx is about the only browser that still notifies you and has you accept each cookie manually.


Most sites would not need a "cookie banner"... unless they wish to track you, mine your data, etc.


As long as the most widely used browser is owned by Google? No way that could possibly end up being intentionally broken and misleading. The law would have to specify the exact shape of the cookie dialogue down to the pixel and I still would expect Google to find a way to fuck it up.


> I still would expect Google to find a way to fuck it up.

Playing cat and mouse with only a few large entities vs. literally every website on the web seems like progress.

And let's be realistic, "intentionally broken" can be prevented by having a serious deterrent and removing the incentive.


> The law would have to specify the exact shape of the cookie dialogue down to the pixel and I still would expect Google to find a way to fuck it up.

In the EU it's more usual for judges to take the "spirit of the law" into account for rulings rather than the "letter of the law" that is more common in Common Law systems.

I don't know enough, and IANAL, to state that with certainty about the whole legal system of every EU country, but as a rule of thumb the law doesn't need to be absurdly specific to avoid loopholes; it just needs to be good enough to give judges ground to judge whether the accused is following its spirit.


> The law would have to specify the exact shape of the cookie dialogue down to the pixel

Sounds good to me.

> I still would expect Google to find a way to fuck it up.

Sure, then we change the law again and/or sue Google.


> Sure, then we change the law again and/or sue Google.

Which generally seems to have an almost 10-year delay for every iteration, since Google will appeal at every instance and do its best to slow down every court-issued request heading its way as much as possible. The result: not happening in the next century or two.


That is... doubling down on a bad idea. Moving the stupid cookie banners into the browser itself, so we cannot block them. It's so idiotic, the EU bureaucrats will probably consider it.


I think it's the other way around, if the cookie banners were implemented at the browser level, there would be "auto-reject" extensions on day 0. Or, worst case, auto-rejecting forks of Chromium and Firefox.


Cookies, including first-party ones, that are not "strictly necessary in order to provide an information society service explicitly requested by the subscriber or user" still require banners under the ePrivacy Directive [1]. Ex: if you're counting unique visitors with a first-party cookie, you need to gather consent.

[1] https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CEL...

(Not a lawyer)


Browsers do; it's the do-not-track header. It's on by default, as it should be. Websites just refuse to honour the header.

Not all websites, though; I believe Medium, of all websites, will actually not embed some content if you send it a DNT header. Not sure if they still do that, though, because their UX for readers has become absolute trash.


The do-not-track header is just another bit for fingerprinting you; I don't believe any ad company actually honors it. Also, why trust that they do, when there's a solution that doesn't need trust?


I want the browser to not let any other party get more bits of entropy than I agree to. My IP is a few bits of entropy. Now I want my browser to give not-that-many-more bits of entropy to any remote server. If it allows a remote server to list my system fonts, render something on a canvas and read back the bytes, or do some audio mixing on my machine and read back low-level results, then my browser has failed me. I want it to say "I'm not showing this webpage at all because it tried to read back a canvas".


> it's the do-not-track header. It's on by default

The DNT header isn't on by default in any major browser.

(Additionally, the spec was abandoned for a bunch of reasons including not being able to agree what constitutes tracking)


Sounds simple. Could you elaborate? Among all of the problems that this legislation aims to solve, what problems can be solved by simply blocking third-party cookies? And what can not?


Tracking cannot be solved by this; today many ad companies get subdomains on the websites they track, so they are technically not "third-party".


That kind of technical countermeasure only works when you're a statistical minority and adtech doesn't care enough to chase after you. If everyone were to block 3p cookies, the adversary would create new ways to share data on the backend, without client-side involvement, and we'd be right back where we started (other than a small increase in friction).


The cookie banner simply means that there's not enough competitive advantage in improved UX over tracking the user.

We don't see many websites opt out of the "track the users all across the web" scheme in order to remove the cookie banners altogether.

On the other hand, thanks to the banner, everyone has become aware that they are being tracked. This is good because it brings people into the discussion, so that when the EU says "stop tracking", people are not puzzled about what tracking those Eurocrats are talking about. How are people supposed to know if they should support the actions of their government if they don't know what's happening behind the scenes?


Would be better to implement it in the browser; similar dialogs to the "X site wants to access the webcam", right?

Cookies are entirely on the client side anyway: trusting every website to do the right thing is obviously not going to work.


Funny thing is that Internet Explorer used to have these banners. But users started disabling them and accepting the cookies when they got too annoying.


That wouldn't accomplish anything, as cookie banners have to do with tracking and not inherently with third-party cookies. Tracking via first-party cookies is still illegal and would require consent.


This would work if cookies were the only way to track people. There is also localStorage, ETags (and other cache-oriented methods), fingerprinting, owning a browser, etc.

What we need is a law that forces websites to obey the "do not track" header.


It's a terrible idea. It will break a lot of sites in a way that's not predictable.


What's an example of a site that needs third-party cookies to work? If it breaks because it can't load Google Analytics, that's a website bug.


Authentication on www.example.com from auth.example.net would be a common issue, for one.

Edit: fixed the domain to actually make the point I was trying to make.


The root domain example.com is still the same here.


Heh. You’re absolutely right, I misspelt the domain. Point being, sometimes you have different domains that belong to the same entity and you need to bridge them; this seems strange to small companies, but happens quite often in enterprise.


Blizzard's Battle.net does, or at least did at one point.

In 2020 my friend couldn't add a new credit card to his account because browsers updated their same-site cookie behavior.

They were setting their JSESSIONID cookie wrong when doing oauth behind-the-scenes which caused a nice 302 redirect loop. For whatever reason the API calls required both *.battle.net and account.blizzard.com.


Microsoft Teams and some Atlassian products also fail when third-party cookies are disabled.

You have to activate them for the login, but you can deactivate them afterwards.


Until early this year, anything by Atlassian broke without third-party cookies because of how they did their SSO; it made it impossible for me to log in with my Firefox browser. They finally fixed it this year.

Sony's PlayStation website also broke with third-party cookies disabled, until about three years ago.


SSO, I think. You cannot authenticate with a Google, FB, or Apple account.


That's not how that works. One common way here is OAuth2, which includes a callback URL such as:

https://internal.yourcompany.com/oauth2/callback?token…

That token in the callback does not require any kind of cookie for subsequent authenticated calls.
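
For illustration, a minimal F# sketch of what happens after the callback (the helper name and URL handling are hypothetical): the token travels in the Authorization header, so no cookie is involved.

    open System.Net.Http
    open System.Net.Http.Headers

    // Hypothetical helper: use the token from the OAuth2 callback as a
    // bearer credential for subsequent API calls; no cookie required.
    let callApi (token: string) (url: string) =
        task {
            use client = new HttpClient()
            client.DefaultRequestHeaders.Authorization <-
                AuthenticationHeaderValue("Bearer", token)
            return! client.GetStringAsync url
        }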


"Sign In with Google" works with third party cookies disabled.


I've had third party cookies blocked for a year, and the number of sites that "broke" can be counted on a single hand.


Now that Safari blocks third-party cookies by default, most sites have adapted.


Websites will get updated pretty damn fast if everyone has them disabled.


Wait, it's not about game development??

