Last 5 email alerts sent for docker on Hacker News
I just got one. I’m blown away by the speed as well. Chrome runs insanely fast! Alas, it’s not developer-ready yet. Brew is a mess. Docker doesn’t work. PyCharm is a WIP, although you can use the x86 version. I was skeptical of the hype, but this little laptop has made me realize how slow everything else is. Unfortunately, while the hardware has accelerated far beyond expectations, the software, specifically macOS Big Sur, is a major step backward. So many fucking animations. Everything feels fluid, like operating in molasses. The UI changes seem to be shoehorned into a desktop that doesn’t need giant white space for fat fingers. Menu bars are twice as tall, taking up precious space. I want to ask Apple’s UI team: WHY!? What the fuck is currently wrong with the macOS Catalina UI? Until you can satisfactorily answer that, there shouldn’t be any change. Stop changing the UI like you’re working at Hermès. It’s not fashion.
by neilpanchal 2020-11-24 22:21:55 | link | parent | submission
I had issues so many times with Scaleway, where it took insanely long to reboot a server, seemingly getting stuck on some network provisioning, so I ended up ditching them altogether. Might use these boxes for some testing or throwaway stuff though. Currently happy with DO and Hetzner, and increasingly so with the latter. But Hetzner's lack of a firewall is a bit annoying, especially when using Docker (which interferes with iptables).
by spurgu 2020-11-24 20:41:51 | link | parent | submission
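The Docker/iptables interference mentioned above is that Docker writes its own NAT rules for published ports, which bypass rules in the host's INPUT chain. A hedged sketch of one common workaround, using the DOCKER-USER chain that Docker reserves for user rules; the interface name `eth0` and the trusted address `203.0.113.10` are placeholders:

```shell
# Rules in DOCKER-USER are evaluated before Docker's own forwarding rules.
# Insert (-I) in reverse order so the final evaluation order is:
# established traffic, trusted host, then drop everything else from outside.
iptables -I DOCKER-USER -i eth0 -j DROP
iptables -I DOCKER-USER -i eth0 -s 203.0.113.10 -j ACCEPT
iptables -I DOCKER-USER -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT
```

Note that rules are inserted rather than appended: Docker populates the chain with a trailing RETURN rule, so appended rules would never be reached.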
I'm not making an assertion about whether that's what I would do if I were in charge, I'm making an assertion about how Apple sees the difference between the iPad and the Mac. :) And sure, it runs counter to what a lot of folks seem to think (and definitely what a lot of folks want). But it's genuinely the way I read the tea leaves. The iPad Pro with the Magic Keyboard arguably is an "almost laptop solution" already -- what I'm saying is that I don't think Apple has any desire to give the iPad a Unix shell, a fully open file system, the ability to sideload apps, and so on. If you want to run Microsoft Office or Photoshop on your iPad, they think that's great, because there are iPadOS versions that fit within their vision of What iPads Are. If you want to run a Docker container with a local web server and Visual Studio Code on your iPad, though, I don't see that fitting in.
by chipotle_coyote 2020-11-24 18:32:24 | link | parent | submission
Last 5 email alerts sent for Kubernetes on Hacker News
You must be a masochist. I'd rather spend 1/10th the time finding out if I have a hit on my hands and address scaling problems later than battle with Kubernetes and Kafka. Build product, not infrastructure.
by cwackerfuss 2020-11-24 19:12:40 | link | parent | submission
Looks like you don't really understand the power of AWS/serverless/Kubernetes/Kafka/Cassandra/GraphQL/React. When your app becomes an instant hit after featuring on the front page of HN, you will have 1 billion daily active users and your business will go up in flames because of the new workload.
by mkrishnan 2020-11-24 19:04:26 | link | parent | submission
After a year of writing, my book Distributed Systems with Node.js was finally published today! The book is available here:

Paperback from O'Reilly (use code DSWN20 for 20% off): https://shop.aer.io/oreilly/p/distributed-systems-with/9781492077299-9149
Paperback from Amazon: https://www.amazon.com/Distributed-Systems-Node-js-Building-Enterprise-Ready/dp/1492077291
Kindle from Amazon: https://www.amazon.com/Distributed-Systems-Node-js-Building-Enterprise-Ready-ebook/dp/B08MTJ4H6L

My intent with this book is to prove that Node.js is just as capable as traditional enterprise platforms (like Java or .NET) for building services that are observable, scalable, and resilient. It's very hands-on, and readers will write application code and integrate it with tooling from various layers of a modern service stack. For example, readers will encapsulate two applications using Docker, run a private container registry to host images, deploy to Kubernetes, configure health checks, scale the instance count, and allow the apps to communicate with each other. Readers will send logs to Elasticsearch, build a dashboard with Kibana, send metrics to StatsD/Graphite, build another dashboard with Graphite, and transmit request spans to visualize request hierarchies with Zipkin. They'll even configure a CI pipeline, run unit tests, enforce code coverage, and deploy to a production server. And that isn't even half of it!

I wrote this book as a Node.js developer who loves writing code that runs on the server but doesn't have much interest in frontend development. Node.js developers often get stereotyped as frontend developers, and I hope this book can help bring an end to that. Let me know if you have any questions!
by tlhunter 2020-11-24 17:47:20 | comments
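To give a flavor of the Kubernetes steps described (health checks, scaling the instance count, images from a private registry), here is a hypothetical Deployment fragment. The app name, port, image path, and probe endpoints are illustrative, not taken from the book:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-api              # hypothetical app name
spec:
  replicas: 3                # "scale the instance count"
  selector:
    matchLabels:
      app: web-api
  template:
    metadata:
      labels:
        app: web-api
    spec:
      containers:
      - name: web-api
        image: registry.local:5000/web-api:latest   # private container registry
        ports:
        - containerPort: 3000
        livenessProbe:       # restart the container if the app stops responding
          httpGet:
            path: /health
            port: 3000
          initialDelaySeconds: 5
          periodSeconds: 10
        readinessProbe:      # only route traffic once the app reports ready
          httpGet:
            path: /ready
            port: 3000
```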
Last 5 email alerts sent for aws on Hacker News
I'm not against using Rust or any other programming language, open source project, or technology. I mean, companies love open source projects because they can produce their own flavor of them as products (search engines, Linux distros, container and orchestration technologies, etc.) whenever they wish, without worrying about uncomfortable dependencies. AWS loves Rust in the same way that Facebook loved PHP.
by germansm 2020-11-24 20:38:46 | link | parent | submission
Probably an unpopular opinion but I am not very excited by this. Getting funding and support is great, but the needs of the many are very different than the needs of AWS and Facebook. In my opinion we see this a lot in Linux as well.
What Google wants has almost nothing to do with what I need.
(I said almost). I think it should gestate a while in a less FAANGy environment.
by ThinkBeat 2020-11-24 20:18:30 | link | parent | submission
I work with AWS quite regularly and this is one of the reasons I don't play with Rust more. If we had an "official" Rust AWS SDK I would probably be building things with it.
by Corrado 2020-11-24 20:12:13 | link | parent | submission
You're trivializing the value they provide. If mitigating large-scale DDoS attacks were as simple as adding a few iptables rules, they would have solved this weeks before paying AWS $3,000 a month for what is effectively an insurance policy. The reality is that AWS has terabits of bandwidth and can mitigate these attacks upstream from your servers, which have bandwidth measured in gbps, not tbps. So, unless you have a global network the size and scale of Amazon or CloudFront, no, you can't just mitigate these attacks with a few firewall rules.
by illumin8 2020-11-24 19:56:51 | link | parent | submission
Last 5 email alerts sent for coreos on Hacker News
Yeah, YC is funding lots of open source projects. PostHog was in our batch. GitLab, Docker, Mattermost, and CoreOS come to mind as other open source YC companies. There are a number of businesses we could build around the project. A cloud service or enterprise products/support are the obvious ones. Right now, we're focused on community building, because a potential open source business can't be successful without a healthy open source project.  https://news.ycombinator.com/item?id=22376732
by bfirsh 2020-11-19 22:07:09 | link | parent | submission
The underlying technology (read-only btrfs system snapshots) is actually much older than CoreOS (in SLE it was introduced somewhere around SLE 11, IIRC), and the idea to use it in a similar way as this is almost as old. It has multiple advantages over rpm-ostree's approach:

- Much faster (no need to manage thousands of hardlinks)
- Arbitrary modifications can be done easily in a chroot
- Not only /usr is snapshotted
- No need for layering and rebases (rpm-ostree has a base image + RPMs on top, which this doesn't need)
- RPMs just work as-is; there are no hacks with %post scripts and so on
by Vogtinator 2020-11-15 16:24:37 | link | parent | submission
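The snapshot-then-chroot flow described above can be sketched roughly as follows. The subvolume paths are illustrative only; on openSUSE, tools like snapper and transactional-update automate these steps:

```shell
# Take a writable snapshot of the current root filesystem.
btrfs subvolume snapshot / /.snapshots/new
# Make arbitrary modifications inside it, e.g. a package update against that root.
zypper --root /.snapshots/new up
# Seal the snapshot read-only, then point the bootloader at it and reboot.
btrfs property set /.snapshots/new ro true
```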
Last 5 email alerts sent for python on Hacker News
Yea, I've been using VSCode these past couple of weeks as I'm now coding in Python + JS after a few years of only JS with WebStorm... and I'm buying a PyCharm license tomorrow. VSCode is fantastic in many ways, and I'll keep using it for my markdown dev notes and for general-purpose programming occasionally, but for anything of substance I'm a JetBrains convert.
by arcturus17 2020-11-24 23:55:32 | link | parent | submission
Last 5 email alerts sent for machine learning on Hacker News
There have traditionally been different approaches and definitions for AI. Some emphasize behaviour while others emphasize the logic behind the behaviour. (In some sense, while expert systems of course were an attempt at getting practical results, they might also have been an attempt to implement what was seen as human reasoning, while e.g. black box machine learning could be more about just getting the behaviour we want.) Some approaches view agents as intelligent if their action resembles humans or other beings that we consider intelligent, while other approaches are merely interested in whether they perform well at a specified task, perhaps more so than humans. So yes, "any solution that imitates intelligent behaviour" is probably right, but with nuances with regard to what that actually means.
by Delk 2020-11-25 01:15:29 | link | parent | submission
> Machine learning is a set of techniques developed to attack modeling problems that traditional algorithms couldn't solve. So if the algorithm was developed and used before computers it definitely isn't machine learning. How does your evidence support your claim?
by RyanGoosling 2020-11-25 00:11:26 | link | parent | submission
I know very little, so perhaps someone could enlighten me. But I am curious how Apple Silicon will be for machine learning. When Apple releases a MacBook Pro with 64GB of unified memory (assuming they will) — won’t that be amazing for machine learning? I am under the impression that GPU memory is a huge factor in performance. Also, is there any way that the neural engine can accelerate training — or is it just for executing trained models faster?
by liamcardenas 2020-11-24 23:55:06 | link | parent | submission
The progress bars for software updates and file transfers on macOS are infuriating. They’re decidedly unhelpful, and more often drastically wrong than anything I could imagine implementing myself. So wrong that some significant improvements would be really low-hanging fruit, if they cared to try:

- Doing anything with many smaller files will perform worse than fewer, larger files. Factor in a size/quantity multiplier/ratio on the initial estimate. For APFS, I think the information needed should be available from init. For other file systems, this information should be gathered in a background process and feed the estimate as it accumulates.
- Revise the estimate more frequently, but not so frequently that it produces pink noise. Apple’s progress bars notoriously get stuck for long periods, even hours. This isn’t just a bad user experience, it’s actually dangerous: users frequently interrupt long-running processes they perceive to be stalled, even if progress is ongoing but merely hard to observe. Swinging too far in the other direction at least gives an indication that something is happening, but the values become meaningless (I don’t know if this is still the case on Windows, but I distinctly remember laughing uncontrollably watching estimates rapidly vacillate between seconds and hours while overall progress appeared to have a steady pace).
- Revise the estimate based on expectations versus reality. As in, if your predetermined formula says 1000 small file operations take 15 seconds, but a comparable workload takes 30 (and vice versa for large file operations), measure that! You don’t need machine learning for this, just a sufficiently proportionate time window to do some basic arithmetic.
- Provide additional observable feedback when these adjustments take place, and when there’s nothing to report other than that something is taking a long time. This doesn’t have to be ugly or noisy; it can be as simple as expanding the progress bar, or messaging that progress is really ongoing even if you have no more information to report.
- Give us nerds a log! Please, for the love of all that’s good in the world, let us see the sausage go through the grinder, even if we need to know a magic incantation to conjure it. And supplement it with a heartbeat message during long runs, so we also don’t fall prey to dangerously assuming the process has stalled.
- I know Apple is notoriously minimalist in its UX style (most of the time), but maybe less information is sometimes actually less. As an example, major OS updates often quietly include multiple stages, like firmware updates, that require a reboot or temporarily disable the display. These can be very alarming to users who don’t know what happened! Again, not just bad UX; it can be misleading, and users may assume they’ve experienced a failure or even an update loop that won’t terminate. A small amount of communication to set expectations could go a long way to alleviate that. A simple list of the overall steps, and a bit of warning before a major visual disruption to observable progress, would significantly reduce the chaotic list of assumptions users might be imagining.
- Maybe these updates are just too big, and doing too much at once. I mean, I understand that a major version update is going to have bigger subsystem impact, but maybe some of the ancillary stuff can be deferred. As an example, a major version update could get enough of the underlying update in place to get the machine into a usable state, then prioritize userland updates based on demand and run the rest in a background process.
- If you just don’t know, maybe it’s better to just be more vague. Apple popularized the spinner where progress is indeterminate. Use it as a last resort!
by eyelidlessness 2020-11-24 22:33:09 | link | parent | submission
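The "expectations versus reality" suggestion really is just basic arithmetic. A minimal sketch in Python (the cost model and smoothing constant are invented for illustration): seed the estimate from file count and sizes, then fold the observed/predicted ratio into a correction factor.

```python
class EtaEstimator:
    """Sketch of the suggestions above: a per-file overhead penalizes
    many-small-file workloads, and a smoothed observed/predicted ratio
    corrects the model as real measurements come in."""

    def __init__(self, sizes, per_file_overhead=0.01, bytes_per_sec=100e6):
        self.per_file_overhead = per_file_overhead  # fixed cost per file (s)
        self.bytes_per_sec = bytes_per_sec          # assumed throughput
        self.remaining = list(sizes)                # sizes not yet processed
        self.correction = 1.0                       # observed/predicted, smoothed

    def _predict(self, size):
        # Initial model: fixed overhead plus time proportional to size.
        return self.per_file_overhead + size / self.bytes_per_sec

    def record(self, size, elapsed, alpha=0.3):
        # Compare the prediction with what actually happened and fold the
        # ratio into an exponentially smoothed correction factor.
        predicted = self._predict(size)
        self.correction = (1 - alpha) * self.correction + alpha * (elapsed / predicted)
        self.remaining.remove(size)

    def eta_seconds(self):
        # Remaining work under the model, scaled by observed reality.
        return self.correction * sum(self._predict(s) for s in self.remaining)
```

If files consistently take twice as long as predicted, the ETA drifts toward twice the modeled value instead of staying stuck, which is the whole complaint about the stock progress bars.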
Last 5 email alerts sent for ruby on Hacker News
Annotations? Beans? Generators? @Entity, @Type, @JsonProperty... Maybe any technology seems magical until you understand it. I see at least as much magic in Java as in Ruby now.
by ericb 2020-11-25 02:52:18 | link | parent | submission
I'm a few days late, so I don't know if you'll see this. When adding "people cards", I do Anki's forward-and-reverse card, and I build the card so I can read both sides and generate the opposite side. I'm just trying to build an association in my brain for their name, so I might do: Front: "people: Neighbors to west, older couple, he likes woodworking" Back: "people: Joe and Trish" I only use a single Anki deck for all my cards, which span many interests, so I'll usually give myself a word to specify the topic, like "Go, Ruby, People, Mental Models". Or if I can grab a headshot or picture from Slack, Twitter, or LinkedIn, I'll often put the picture on the front and the name on the back. When I review the card "from the back", I try to remember what the person looks like. I have aphantasia, so this form of recall is effortful.  https://www.facebook.com/notes/blake-ross/aphantasia-how-it-...
by wonder_er 2020-11-25 01:55:17 | link | parent | submission
Oh my god, the flashbacks! Circa 2005 doing client work during summers off of uni in this newfangled “Ruby on Rails” (v0.8.x). I was new to all of it. Coming from Flash/Flex/AS and jumping both feet into Ruby, Rails, JS via Script.aculo.us! Great times!
by kjsthree 2020-11-25 00:50:09 | link | parent | submission
Didn’t you mean that the other way around? If Ruby ecosystem became what current JS ecosystem is now, it wouldn’t be Ruby anymore, at least the way people who love it love it. I’m not quite sure that it would benefit from dispersion of focus on itself onto flavour of the <insert your noun here> every <relatively frequent period of time>. It’s such a good language to write stuff in not least because indeed it keeps trying to move along the axis of that aspect of it.
by j_crick 2020-11-24 23:40:28 | link | parent | submission
Last 5 email alerts sent for bitcoin on Hacker News
CME offers Bitcoin futures, available for trading via Globex. It's a futures contract, so you won't hold any BTC position (long or short), and it's actually cash-settled, so you don't have to worry about delivering or receiving actual BTC. However, margin will be steep, considering the volatility; my broker in particular actually has a $200k overnight margin requirement for a short position. I'm sure you've heard the saying that the market can remain irrational for longer than you can remain solvent. Please keep that in mind when shorting bubbles.  https://www.cmegroup.com/trading/equity-index/us-index/bitco...
by hpkuarg 2020-11-25 03:25:49 | link | parent | submission
Do you consider the energy spent mining gold a waste? This electricity is necessary to mine bitcoin. It is the same principle. If you have a more efficient way to mine bitcoin, just do it, you will be rich, or just tell everyone else and everybody will use it.
by mercenario 2020-11-25 02:57:16 | link | parent | submission
It's pretty simple: if it costs you 1 dollar to mine a bitcoin and someone (the market) wants to buy that bitcoin for 2 dollars, that cost is worth it.
It doesn't matter what that cost is, whether it's 1 dollar in energy or 1 dollar in wages or whatever; it costs $1 and you sell for $2, you profit, and the cost was worth it.
by mercenario 2020-11-25 02:53:23 | link | parent | submission
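The arithmetic here can be written out as a tiny, hypothetical helper:

```python
def mining_profit(cost_per_coin, market_price, coins_mined):
    """The commenter's point: profit is revenue minus total cost,
    regardless of whether that cost is energy, wages, or anything else."""
    revenue = market_price * coins_mined
    cost = cost_per_coin * coins_mined
    return revenue - cost

# Mine at $1/coin, sell at $2/coin: each coin nets $1 of profit;
# a negative result means the cost was not worth it.
```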
What's the best way to take a bearish position on Bitcoin? I looked into it during the price surge a few years ago, but couldn't find anything that didn't involve setting up non-USA accounts. Any secure ways to do this for US citizens?
by jb775 2020-11-25 02:40:36 | link | parent | submission
Nope. But when the volume and volatility are high like they are right now, it can be profitable to be a liquidity provider instead. It's actually very easy to provide liquidity to Uniswap and other decentralized exchanges such as Curve.fi - both have bitcoin-paired pools.
by bouncycastle 2020-11-25 02:03:54 | link | parent | submission
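For context, Uniswap-style pools follow a constant-product rule with a 0.3% swap fee that is left in the pool; that retained fee is what liquidity providers earn when volume is high. A minimal sketch (not the actual contract logic, and ignoring LP share accounting):

```python
class ConstantProductPool:
    """Toy Uniswap v2-style pool: swaps preserve reserve_a * reserve_b
    computed on the fee-reduced input, so the 0.3% fee stays in the pool
    and grows the reserves backing each liquidity provider's share."""

    FEE = 0.003  # 0.3% swap fee, retained as reserves

    def __init__(self, reserve_a, reserve_b):
        self.reserve_a = reserve_a
        self.reserve_b = reserve_b

    def swap_a_for_b(self, amount_in):
        effective_in = amount_in * (1 - self.FEE)   # only this part prices the swap
        k = self.reserve_a * self.reserve_b
        amount_out = self.reserve_b - k / (self.reserve_a + effective_in)
        self.reserve_a += amount_in                 # the full input enters the pool
        self.reserve_b -= amount_out
        return amount_out
```

After each swap the product of the reserves is slightly larger than before, which is how fee income accrues to the pool without any explicit bookkeeping.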
Last 5 email alerts sent for ios on Hacker News
That explains an iOS vs. Android difference (ARC vs. garbage collection), but it doesn't explain the article's (and Gruber's) apparent argument that Apple Silicon machines running native Objective-C/Swift code use less memory than the same apps natively built from Objective-C/Swift on Intel running the same OS (but different machine code, obviously).
by berkut 2020-11-25 03:58:37 | link | parent | submission
Have you tried doing Android dev using an Android emulator on an 8 GB Mac? Perhaps in combination with Xamarin? I did and performance was horrible. For just iOS or macOS dev, 8 GB could be fine though.
by wsc981 2020-11-25 03:20:00 | link | parent | submission
Um, this is a feature, not a bug. If you're syncing with an iOS device in Finder, well, cancel that first. Lots of applications can prevent rebooting to avoid data loss. Again, this is a GOOD thing. Sheesh.
by crazygringo 2020-11-25 03:15:56 | link | parent | submission
This is actually how I feel about modern iOS too. Why is it a swipe here, a modal there... none of it has a rhyme or reason that makes sense to me. OK, to get what I want here, do I need to force press, swipe, or tap some icon somewhere? There’s no consistency, everything is hidden, and each little thing to use the OS better is a “trick.” On my personal iPhone, you swipe from the bottom to get Control Center, and on my work iPhone, you swipe from the top corner. If you want to send the output of one program to the input of another, you click Share, move past a bunch of Contacts and AirDrop entries that I don’t think I’ve ever used this way, swipe left/right to find an app, don’t find it, and either need to click an “Add” or “Other” button OR swipe further down to click More..., until you can select the thing you actually want. I have so many gripes with the whole system. I almost believe that they are trying to make the thing harder to use so that they can create a dark pattern around feeling a sense of mastery... but when I was trying to walk an elderly relative through the menus over the phone, it became especially obvious just how much specialized knowledge the iPhone requires in order to do the very most basic of things, and almost NONE of it is discoverable.
by IggleSniggle 2020-11-25 02:45:15 | link | parent | submission