Blazor is not equal to JavaScript. Blazor is Microsoft's answer to React and Flutter. Of the three, so far I like Blazor best, but not the server version that uses SignalR to communicate with the client, because it locks you in. Blazor WebAssembly is awesome, but it just needs a good DOM manipulation library. With Blazor Wasm on the client, I can use anything on the back end.
The question is very unrealistic... how often do you know there is exactly one duplicate in a list?
Much more common is there are zero or more duplicates.
Here is what I came up with:
- We need a way to record what has been already seen
- A hash-map could work, but I think we can be more efficient
- We know that all elements are less than the array length in size...
- So we can allocate a single array of flags with the same length as the input
- The flag for value `n` is at position `n` in this array
let findDuplicates xs =
    // One flag per possible value; assumes 0 <= x < length xs for every x.
    let seen = Array.create (Seq.length xs) false
    let mutable duplicates = []
    for x in xs do
        if seen[x] then
            // Second sighting: record the duplicate.
            duplicates <- x :: duplicates
        else
            seen[x] <- true
    duplicates
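A quick sanity check (remember the assumption that every value is non-negative and less than the input length):

    findDuplicates [| 0; 1; 2; 1; 3; 2 |]   // [2; 1]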
I'm pretty sure this is O(n)?
I think it's interesting that the mathematical trick (the sum-of-numbers-1-to-n formula) does not work in the more realistic variant. This fact is probably why leet-code problems are so disconnected from the real world. It's like AI for board games.
The question isn't really about finding duplicates in a list, but a slightly backhanded way to ask "do you remember high school math and can you apply it". A less artificial problem would just make things more complicated.
Also, your way is O(n) memory when O(1) would be enough for the original question.
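For reference, the O(1)-memory version for the original question is just the arithmetic-series formula. A minimal sketch, assuming the list holds every value 1..n exactly once plus a single duplicate (the function name is mine):

    // sum(xs) - sum(1..n) isolates the one duplicated value
    let findTheDuplicate (xs: int list) =
        let n = List.length xs - 1        // the list has n + 1 elements
        List.sum xs - n * (n + 1) / 2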
Always start a programming interview by asking questions. You know there is exactly one duplicate in the list because you wouldn’t start coding without thinking about what the problem left unspecified and getting the interviewer to clarify.
If some state information is stored as metadata, then we need that metadata to know the total state. The contents of TF state often include more than what can be read from the provider APIs.
On the other hand, it also avoids Tesla when it inevitably crashes back to the same ballpark as other automotive companies... No, I really believe it is nothing special and will eventually come down.
It's not that GraphQL enables anything that was not possible before; it's that GraphQL provides a bit more structure and standardization around these things. If you plan to do this stuff, why not follow something with a spec and various bits of tooling rather than doing it ad-hoc?
Yes, this is true, I'm mostly worried about the elliptic curve signature part, as everything else could be fixed with an emergency hard fork (except SHA256).
Sadly, the OP_CAT operation (or a substring-equality operation) is disabled; re-enabling it would make Lamport signatures available again for high-value transactions. I would love it if Lamport signatures were enabled again (it would be quite easy to do), but I'm afraid there isn't enough consensus to do it at this point, because some people would think it wasteful, and Lamport signatures are also dangerous, as each key can be used only once.
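For the curious, the scheme itself is tiny. A minimal Lamport one-time-signature sketch (illustrative only; this is the textbook construction, not Bitcoin Script, and all the names are mine):

    open System.Security.Cryptography

    let sha256 (bytes: byte[]) = SHA256.HashData bytes

    let rng = RandomNumberGenerator.Create()
    let randomBytes n =
        let buf = Array.zeroCreate<byte> n
        rng.GetBytes buf
        buf

    // Key generation: one (secret0, secret1) pair per bit of a 256-bit hash.
    // The public key is the hash of every secret.
    let generateKeyPair () =
        let secrets = Array.init 256 (fun _ -> randomBytes 32, randomBytes 32)
        let publics = secrets |> Array.map (fun (s0, s1) -> sha256 s0, sha256 s1)
        secrets, publics

    // Bit i (big-endian) of a 32-byte hash.
    let bitAt (h: byte[]) i = (h.[i / 8] >>> (7 - i % 8)) &&& 1uy

    // Signing reveals one secret per bit of the message hash -- which is
    // exactly why each key pair must only ever sign one message.
    let sign (secrets: (byte[] * byte[])[]) msg =
        let h = sha256 msg
        secrets |> Array.mapi (fun i (s0, s1) -> if bitAt h i = 0uy then s0 else s1)

    // Verification: every revealed secret must hash to the matching public half.
    let verify (publics: (byte[] * byte[])[]) msg (signature: byte[][]) =
        let h = sha256 msg
        signature
        |> Array.mapi (fun i s ->
            let p0, p1 = publics.[i]
            sha256 s = (if bitAt h i = 0uy then p0 else p1))
        |> Array.forall id

Verification needs only hashing and equality checks, which is the sense in which something like OP_CAT would bring it within reach of Script.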
The compiler will get plenty complicated even without IDE scenarios, trust me on that one. Slowness is also never really a thing to worry about here, especially because usage patterns in an IDE vs. a batch process are so different. It's almost always the other way around: someone writes something that's completely fine for a batch process but tanks IDE performance.
Here's an idea for the EU: mandate that all major browsers ship with third-party cookies disabled by default and drop the whole cookie-banner nonsense.
Another idea: make browsers provide a proper cookie banner, not one that tricks me into selling my soul. I never got why pages would need individual banners.
Evil sites will use localStorage or some third party API and continue tracking you.
I am sure people here will find at least 20 solutions to the problem of "how can a group of evil websites track a user across sites if cookies do not work but JS is on". The solution would involve something like: drop these lines into your HTML page, and the JS code there will connect to some server and store a fingerprint there. Google might even decide to give your browser a fingerprint to help with their ad business.
I wouldn't approach this problem from a technical direction.
If there is a browser-based, vendor-agnostic opt-in popup for user tracking (not only cookies), you can outlaw and severely punish attempts to circumvent it.
Given the time and resources involved, courts really dislike the "welllll, technically..." argument.
I know. I tried (and probably failed) to explain to OP why his simple idea to "just make the browsers disable third-party cookies", and other technical solutions, are not going to work. You need GDPR-like laws that focus on the actual problem rather than some technical implementation, because developers will find workarounds for anything purely technical.
Browsers could help by implementing a standard GDPR popup for these shitty websites to share. At least it would not be the same dark-pattern UX and broken-implementation shit these sites use today.
Browsers could do a lot of good things if they focused on actual user needs and not on what some developer feels is cool to work on or what some giant company wants to implement next.
Gotcha. FYI, the post you replied to suggested "make browsers have a proper cookie banner", which seems like you agree with? The one that says "disable third party cookies" was two levels above your post.
>Gotcha. FYI, the post you replied to suggested "make browsers have a proper cookie banner", which seems like you agree with? The one that says "disable third party cookies" was two levels above your post.
Ah, sorry, I messed up. I am trying to force myself to always quote the text I am replying to; sometimes I do not do it, and it causes issues. I will try to do better.
It would be nice to have browser support for a cookie popup that is uniform and not worded differently everywhere. Maybe even a default setting and the ability to auto-reject. The popups have ruined the experience.
> Maybe even a default setting and ability to auto-reject.
We sort of have auto-reject, with the Do Not Track header. Which pretty much everyone has decided to ignore, because then people just say no and that's not the result they want.
> The popups have ruined the experience.
And behind every popup is a company that decided that ruining your experience was the correct thing to do.
The problem is that both the client and the server are controlled in large part by Google, and they like to optimize the user experience into whatever allows them to sell ads.
Lynx is about the only browser that still notifies you and has you accept each cookie manually.
As long as the most widely used browser is owned by Google? No way that could possibly end up being intentionally broken and misleading. The law would have to specify the exact shape of the cookie dialogue down to the pixel and I still would expect Google to find a way to fuck it up.
> The law would have to specify the exact shape of the cookie dialogue down to the pixel and I still would expect Google to find a way to fuck it up.
In the EU it's more usual for judges to take the "spirit of the law" into account for rulings rather than the "letter of the law" that is more common in Common Law systems.
I don't know enough, and IANAL, to state that with certainty about the legal systems of all EU countries, but as a rule of thumb the law doesn't need to be absurdly specific to avoid loopholes; it just needs to be good enough to give judges ground to judge whether the accused is following its spirit.
> Sure, then we change the law again and/or sue Google.
Which generally seems to have an almost 10-year delay for every iteration, since Google will appeal at every instance and do its best to slow down every court-issued request heading its way to the fullest extent possible. The result: not happening in the next century or two.
That is... doubling down on a bad idea: moving the stupid cookie banners into the browser itself so we cannot block them. It's so idiotic that the EU bureaucrats will probably consider it.
I think it's the other way around, if the cookie banners were implemented at the browser level, there would be "auto-reject" extensions on day 0. Or, worst case, auto-rejecting forks of Chromium and Firefox.
Cookies, including first-party ones, that are not "strictly necessary in order to provide an information society service explicitly requested by the subscriber or user" still require banners under the ePrivacy Directive [1]. Ex: if you're counting unique visitors with a first-party cookie, you need to gather consent.
Browsers do: it's the Do Not Track header. It's on by default, as it should be. Websites just refuse to honour the header.
Not all websites, though; I believe Medium, of all websites, will actually not embed some content if you send it a DNT header. Not sure if they still do that, though, because their UX for readers has become absolute trash.
The do-not-track header is just another bit for fingerprinting you; I don't believe any ad company actually honors it. Also, why trust that they do when there's a solution that doesn't need trust?
I want the browser to not let any other party get more bits of entropy than I agree to. My IP is a few bits of entropy. Now I want my browser to give not that many more bits of entropy to any remote server. If it allows a remote server to list my system fonts, render something on a canvas and read back the bytes, or do some audio mixing on my machine and read back low-level results, then my browser has failed me. I want it to say "I'm not showing this webpage at all because it tried to read back a canvas".
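To put numbers on that: uniquely identifying one person among roughly 8 billion takes only about 33 bits in total, so every extra readable surface matters. A quick back-of-the-envelope in F#:

    // bits of entropy needed to single out one user in a population
    let bitsToIdentify (population: float) = log population / log 2.0
    let bits = bitsToIdentify 8e9   // ~32.9 -- fonts, canvas, and audio add up fast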
Sounds simple. Could you elaborate? Among all of the problems that this legislation aims to solve, what problems can be solved by simply blocking third-party cookies? And what can not?
That kind of technical countermeasure only works when you're a statistical minority and adtech doesn't care enough to chase after you. If everyone were to block 3p cookies, the adversary would create new ways to share data on the backend, without client-side involvement, and we'd be right back where we started (other than a small increase in friction).
The cookie banner simply means that there's not enough competitive advantage in improved UX over tracking the user.
We don't see many websites opting out of the "track the users all across the web" scheme in order to remove the cookie banners altogether.
On the other hand, thanks to the banner, everyone has become aware that they are being tracked. This is good because it brings people into the discussion, so that when the EU says "stop tracking", people are not puzzled about what tracking those Eurocrats are talking about. How are people supposed to know whether they should support the actions of their government if they don't know what's happening behind the scenes?
Funny thing is that Internet Explorer used to have these banners. But users started disabling them and accepting the cookies when they got too annoying.
That wouldn't accomplish anything, as cookie banners have to do with tracking and not inherently with third-party cookies. Tracking via first-party cookies is still illegal and would require consent.
This would work if cookies were the only way to track people. There is also localStorage, ETag (and other cache-oriented methods), fingerprinting, owning a browser, etc.
What we need is a law that forces websites to obey the "do not track" header.
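To make one of those concrete: ETag-based tracking needs no cookies at all. The server hands out a unique cache validator and the browser returns it automatically on every revisit (the tag value below is hypothetical):

    First visit:    ETag: "user-3f29c1"
    Later visits:   If-None-Match: "user-3f29c1"   <- a cookie in disguise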
Heh. You’re absolutely right, I misspelt the domain. Point being, sometimes you have different domains that belong to the same entity and you need to bridge them; this seems strange to small companies, but happens quite often in enterprise.
Blizzard's Battle.net does, or at least did at one point.
In 2020 my friend couldn't add a new credit card to his account because browsers updated their same-site cookie behavior.
They were setting their JSESSIONID cookie wrong when doing OAuth behind the scenes, which caused a nice 302 redirect loop.
For whatever reason the API calls required both *.battle.net and account.blizzard.com.
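For context, that timing matches Chrome 80 (early 2020), which started treating cookies without a SameSite attribute as SameSite=Lax. A session cookie that has to survive a cross-site OAuth redirect chain must now opt in explicitly, along the lines of:

    Set-Cookie: JSESSIONID=<session id>; Secure; SameSite=None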