
Hash of Feature Layer Graphics

On a web application I'm currently working on, some of the data for the GIS service is bad. For instance, there may be multiple polygons that should be a single graphic, but instead each one is its own individual graphic. So what I'm doing right now is creating a hash called GraphicHash, keyed by the name of each graphic from its attributes.

From there I get each graphic and do a union on their extents.

    var GraphicHash = {};    // hash for all the graphics
    var ctr = 0;             // counter for collisions
    var zoom_change = false;

    // Generate the hash keyed on REC_AREA_NAME; on a collision, add a number to the end of the key
    recLayer.on("graphic-add", function (evt) {
      if (zoom_change === false) {
        var cur_graphic = toIDCase(evt.graphic.attributes.REC_AREA_NAME);
        if (GraphicHash[cur_graphic] == null) {
          GraphicHash[cur_graphic] = evt.graphic;
        } else if (GraphicHash[cur_graphic].attributes.STREET_ADDRESS !== evt.graphic.attributes.STREET_ADDRESS) {
          // Same name but a different address: store under a numbered key
          GraphicHash[cur_graphic + ctr] = evt.graphic;
          ctr++;
        } else {
          // Same feature split across polygons: union the extents
          var new_extent = GraphicHash[cur_graphic].geometry.getExtent().union(evt.graphic.geometry.getExtent());
          GraphicHash[cur_graphic].geometry.getExtent().update(new_extent);
        }
      }
    });

However, after I have created that hash, I would like to go through each graphic, calculate the centroid and then display it on the map. Currently I am doing the following:

    for (var graphic in GraphicHash) {
      var centroid = GraphicHash[graphic].geometry.getCentroid();
      GraphicHash[graphic].setGeometry(centroid);
      GraphicHash[graphic].setSymbol(recSymbol);
      //graphic_array.push(GraphicHash[graphic]);
      recLayer.add(GraphicHash[graphic]);
    }

I do this on the "update-end" event. However, in some instances I get errors saying getCentroid() is not a function, and other times a point exists for one of the hashed values but other polygons that should have been hashed were not included; those unhashed polygons appear on the map as polygons.


The graphic looks like it was already added to the layer, so you might expect that resetting its geometry and symbol would work.

You will not be able to re-add the same graphic.

Instead you should create a new instance of the graphic in the for loop:

    var g = new Graphic(centroid, recSymbol);
    recLayer.add(g);

Performance Management and Quality Improvement

Healthy People
Healthy People provides science-based, 10-year national objectives for improving the health of all Americans. For three decades, Healthy People has established benchmarks and monitored progress over time in order to encourage collaborations across sectors, guide individuals toward making informed health decisions, and measure the impact of prevention activities.

County Health Rankings and Roadmaps
The annual rankings measure vital health factors, including high school graduation rates, obesity, smoking, unemployment, access to healthy foods, the quality of air and water, income inequality, and teen births in nearly every county in America. They provide a snapshot of how health is influenced by where we live, learn, work, and play.

Mobilizing Action Toward Community Health (MATCH)
Provides communities with reliable, comprehensive information about their present health status in the form of the nation's first-ever County Health Rankings, which will be updated each year.

Community Commons
Community Commons is an interactive mapping, networking, and learning platform for the broad-based healthy, sustainable, and livable communities' movement. It offers a web-based application that helps users interactively visualize data through Geographic Information Systems (GIS).


Geographic Information Systems (M. Eng.)

Those who are skilled in effective data processing with cutting-edge information systems can choose from many professional opportunities in key positions in our society. This is especially true for the future, as the possibilities for generating and applying data are far from exhausted. At the same time, demand for well-trained experts in the public and private sectors is increasing. Regardless of the industry in which you work: if you are interested in bringing GIS into your working environment, you can deepen your knowledge, skills, and methods in this career-integrated master's program in Geographic Information Systems. Graduates are familiar with the most recent insights in the field and apply scientific methods in an interdisciplinary manner. As with all degree programs at Anhalt University, this continuing education program is characterized by a strong practical orientation: problems from daily working life are identified, and specific solutions are developed for them.

Requirements

A qualified university degree from a bachelor's or Diplom program with a standard period of study of at least seven semesters (six semesters are possible if additional modules are taken), as well as related qualified professional experience, typically of at least one year. Admission is granted through a selection process.


Investing in a modern laptop (price range 800 to 1,000 euros) will make your studies easier, since, for example, the Moodle learning platform can only be accessed via the Internet. As basic software you will need a current Windows operating system, a web browser, and standard office software (Microsoft Office). Additional specialist software is provided as a free, full-featured student version that you can use for the duration of your studies.

Educational Leave (Bildungsurlaub)
In most German federal states, employees can be released from work for a certain period in order to take part in continuing education. This is known as Bildungsurlaub or Bildungsfreistellung (educational leave). For more information, see: http://www.iwwb.de.

Education Bonus (Bildungsprämie)
With the Bildungsprämie, the state subsidizes professional continuing education. It sets targeted financial incentives to make continuing education affordable and to expand individual career opportunities. For more information, see: http://www.bildungspraemie.info/

Education Loan Program (Bildungskreditprogramm)
The Bildungskreditprogramm offers a fixed-term, low-interest loan to support students. For more information, see: http://www.bmbf.de

Advancement Scholarship (Aufstiegsstipendium)
This program gives especially talented women and men with professional experience an additional financial incentive to take up university studies. The Aufstiegsstipendien are aimed in particular at those who qualified for higher education through vocational training, further training, or professional practice. Further information is available at: http://www.bmbf.de/aufstiegsstipendium

Scholarship Foundations (Begabtenförderungswerke)
Gifted students can receive a scholarship from one of the Begabtenförderungswerke supported by the BMBF. In addition to above-average academic performance, admission requires social engagement. Further information: http://www.stipendiumplus.de/

Alternative Funding and Loan Options
Alternative funding or loan options may also be worth considering. An overview for comparing various student loans can be found at: http://www.che.de/downloads/CHE_AP_224_Studienkredit_Test_2019.pdf

Scholarship Database
A clearly organized scholarship database, which also includes a wide range of offers from other foundations, can be found at: http://www.stipendienlotse.de/.

Tax Implications
Expenses arising from taking out a loan, as well as course, examination, and accommodation costs, can be claimed as income-related expenses (Werbungskosten) or special expenses (Sonderausgaben) in the annual wage tax adjustment or the income tax return.

As a continuing education program, the degree is aimed at working professionals who work in or around geographic information systems. Admission requires a degree from a program of at least six semesters (at least a bachelor's) as well as one year of professional experience.

In the second semester of the master's program in Geographic Information Systems, a summer school must be completed. Participants receive practice-oriented assignments to be worked on in teams according to subject-specific requirements. For this we recommend the GIS-Camp of Anhalt University, which has been run successfully for several years as an interdisciplinary event. Since all participants come from different fields of work and study, the camp produces interesting interdisciplinary perspectives and an extensive exchange of experience. The professional highlight is the public colloquium at the end of the camp, at which all participants present their results.

Beyond that, the camp also offers an opportunity to get to know one another and exchange ideas in a relaxed working atmosphere, away from the usual routines of study and work. Excursions, nature walks, field trips, and a wide range of sports activities round out the scientific part of each day.

The date of the 2020 GIS-Camp has been set and can therefore easily be factored into family vacation planning: the summer school will take place from mid to late July 2020 on the campus in Dessau. For further information, please contact [email protected].

Here, the term "geographic information systems" is used in the broader sense of computer systems that handle geodata. The program is therefore divided into several modules, supplemented by compulsory elective modules for specializing in particular topics, such as geostatistics, project management, and much more, as well as in specific application areas, such as the environment or municipal administration. The modules include:

  • Geographic information systems (fundamentals, applications, modeling, analysis, etc.)
  • Databases and geodatabases
  • Geodata (data sources, acquisition methods, formats, etc.)
  • Remote sensing
  • Visualization and cartography
  • Spatial data infrastructures
  • Project work

In this part-time program, more than 90 percent of the course content is prepared for delivery over the Internet and can therefore be used regardless of time and place. A few on-site sessions, at the university or at a regional study center, are required for organizational and legal reasons. These attendance phases are kept as short as possible. They serve to introduce the e-learning environment and the semester's modules, and to practice working with special equipment or specialist software. Examinations must also be taken during this time. Not least, you will get to know your fellow participants and your online tutors in person.

For project assignments, you may submit alternative proposals, such as using your own data or adapting the task. For the master's thesis, we recommend choosing a topic from your own professional environment.

We plan up to two attendance phases per semester, each the length of a long weekend (Friday and Saturday). In addition, a two-week summer school must be completed.

In this blended-learning model (a combination of face-to-face sessions and virtual learning based on modern information and communication media), course materials and assignments are made available to students via an online learning platform, which also organizes communication with course supervisors and fellow students. Students can also submit their completed assignments there and communicate with the instructors.

While a course is active, the course supervisors are regularly available online and answer all questions submitted via a forum or by e-mail. The practical projects can also be worked on in study groups over the Internet. We regard communication between participants and course supervisors, as well as among the participants themselves, as an essential element of learning success.


AMI® TruE™ Trusted Environment Platform Security Solution

The AMI TruE™ Trusted Environment Platform Security Solution enables confidential computing by isolating sensitive data in an encrypted CPU enclave during processing. It uses Intel® Software Guard Extensions (Intel® SGX) and Intel® Security Libraries for Data Centers (Intel® SecL-DC), found in the latest Intel® Xeon® processors, to provide a true trusted environment for confidential computing and secure cloud execution.

AMI TruE enables secure computing, easy-to-deploy workload attestation, and secure application keys without compromising confidentiality or adding cost. It delivers a holistic, secure data center solution that is scalable, extensible, and built for cloud-to-edge applications. With it, operators can establish and track the trusted compute status of servers in the data center, comply with data sovereignty regulations, run sensitive workloads on trusted servers, and apply remediation measures to untrusted platforms.

AMI TruE helps data centers secure platforms throughout the entire platform life cycle by providing end-to-end firmware security and verification across the data center and by integrating with other data center management and orchestration tools to provide a holistic view of platform firmware security for all servers in use. Supply chain attacks can be detected by attesting the shipped firmware and software hash information of new platforms. After deployment, server trust validation continues to attest the integrity of the firmware and software running across the enterprise.


Remote Sensing Image Retrieval with Deep Features Encoding of Inception V4 and Largevis Dimensionality Reduction

Remote sensing image retrieval is an effective means to manage and share massive remote sensing image data. In this paper, a remote sensing image retrieval method is proposed that adopts Inception V4 as the backbone network for extracting deep features. To represent the low-level visual information of the remote sensing image, the feature maps generated by the 5 × 5 convolutional kernels in the first Reduction Block of Inception V4 are extracted and reorganized. Next, VLAD (Vector of Locally Aggregated Descriptors) encoding is applied to the reorganized features to obtain a compact feature representation vector, which is concatenated with the features extracted from the fully connected layers to form the overall feature vector of the image. To avoid the "curse of dimensionality," the LargeVis dimensionality reduction method is used to reduce the dimensionality of the image feature vector while improving its discriminative capability. The reduced feature vector is then used for image retrieval with the L2 distance metric. Experimental results on the RS19, UCM, and RSSCN7 datasets demonstrate that, compared with existing methods, the proposed method achieves state-of-the-art retrieval performance.



Python 3.3+ includes mksalt in crypt, which makes it much easier (and more secure) to use:
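(The original snippet is not preserved here; the following is a minimal sketch of the intended usage. The example password is illustrative, and note that the crypt module was removed from the standard library in Python 3.13.)

    import crypt

    # Generate a salt for SHA-512 crypt and hash a password with it
    salt = crypt.mksalt(crypt.METHOD_SHA512)   # e.g. '$6$O2ZiL8rt/4YGDyLM'
    print(crypt.crypt("p@ssw0rd", salt))       # '$6$<salt>$<hash>'

    # With no argument, mksalt uses the strongest method available
    print(crypt.mksalt())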

If you don't provide an argument to crypt.mksalt (it accepts crypt.METHOD_CRYPT, crypt.METHOD_MD5, crypt.METHOD_SHA256, and crypt.METHOD_SHA512), it will use the strongest method available.

The ID of the hash (the number after the first $) indicates the method used:

  • 1 -> MD5
  • 2a -> Blowfish (not in mainline glibc; added in some Linux distributions)
  • 5 -> SHA-256 (since glibc 2.7)
  • 6 -> SHA-512 (since glibc 2.7)

I'd recommend you look up what salts are, and, as per smallclamger's comment, the difference between encryption and hashing.

Update 1: The string produced is suitable for shadow and kickstart scripts.
Update 2: Warning: if you are using a Mac, see the comment about using this in Python on a Mac, where it doesn't seem to work as expected.

On macOS you should not use the versions above, because Python uses the system's version of crypt(), which does not behave the same and uses insecure DES encryption. You can use this platform-independent one-liner (requires passlib, installed with pip3 install passlib):
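(The exact one-liner is not preserved above; the following sketch uses passlib's sha512_crypt, with rounds=5000 so the output matches the default /etc/shadow format. The password is illustrative.)

    python3 -c 'import passlib.hash; print(passlib.hash.sha512_crypt.using(rounds=5000).hash("p@ssw0rd"))'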


2 Answers

If you use a deterministic encryption algorithm (so that you can actually verify passwords without the private key), it basically works like a backdoored hash. An attacker will be able to run a brute-force or dictionary attack as usual.

One obvious problem with any reversible encryption is that it reveals (at least something about) the password length. (E.g. if you use raw RSA you have to decide how to handle passwords longer than the modulus.) With a real password hash you can allow practically unlimited length without leaking anything about it if your database is compromised. Among other things, this may also allow an attacker to spend their resources on cracking the shortest passwords without wasting them on those that are too strong to crack.

You also miss out on the opportunity to use a slow and/or memory-hard password hash. While asymmetric encryption is usually slower than a simple cryptographic hash, it is not slower by the many orders of magnitude that a password hash allows.

Finally, there is a single point of failure (your private key) that allows leaking all the passwords in the database. If your key is compromised despite your precautions, an attacker can decrypt all of the stored values. This means the scheme is at least theoretically weaker than password hashing, which has no back door and therefore requires every password to be attacked individually.

I would also recommend using established password-hashing functions, though rather than either of the ones SEJPM mentions, I would recommend whatever your programming language or available libraries already support, whether that is bcrypt or scrypt.

If you really need the plaintext password to be recoverable, you could store both a strong password hash and a normal, non-deterministic public-key encryption of the password: $H(s, p) || E_{pk}(p)$, where $pk$ is the recovery public key. Password verification would be done using the slow password hash, and recovery of the plaintext password, when needed, by decrypting. The limitations regarding password length and the single point of failure mentioned above would still apply.
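As a rough illustration of that layout, here is a minimal Python sketch; the scrypt parameters are illustrative, and encrypt_for_recovery is a hypothetical stand-in for a real non-deterministic public-key scheme (for example a libsodium sealed box):

    import os, hmac, hashlib

    def store_password(password: bytes, encrypt_for_recovery):
        # Slow, salted hash H(s, p) used for everyday verification
        salt = os.urandom(16)
        digest = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1)
        # Non-deterministic public-key encryption E_pk(p) kept only for recovery;
        # encrypt_for_recovery is a hypothetical callable supplied by the caller
        recovery_blob = encrypt_for_recovery(password)
        return salt, digest, recovery_blob

    def verify_password(password: bytes, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1)
        return hmac.compare_digest(candidate, digest)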


8 Answers

Why isn't it used? Because it's a lot of extra work for zero gain. Such a system would not be more secure. It might even be less secure because it gives the false impression of being more secure, leading users to adopt less secure practices (like password reuse, dictionary passwords, etc).

In theory this doesn't give any extra security, but in practice this can be used to protect against "rogue sites" that don't hash your password in the server.

How exactly does this protect you? It sounds like all you want to do is hash the hashed password, which is sort of pointless, because the hashed password would then become the password.

There are many sites on the Internet that require login information, and the only way to protect against password reuse is the "promise" that the passwords are hashed on the server, which is not always true.

How about not using the same password for more than one site? The reason websites hash the password, in theory, is to prevent access to your account if THEY are compromised. Using the same password for multiple websites is just stupid.

If you did use JavaScript, all the "hacker" would have to do is apply the same method to the hashed-hashed passwords. Once they have the hashed information, the only factor preventing access to an account is the time it takes to find a password that produces the same hash as the one in the database.

Because it would add little to no value. The point of hashing is that if your database gets hacked, the hacker does not obtain a list of valid passwords, only hashes, and therefore cannot impersonate any user. Your system has no knowledge of the password.

Security comes from SSL certificates plus some form of authentication. I want my users to supply a password so I can calculate the hash from it.

Also, the hashing algorithm would be on the server, in a more secure area. On the client it's pretty easy to get at the source code of JavaScript, even if it's hidden in referenced script files.

Most replies here seem to completely miss the point of client-side password hashing.

The point is not to secure access to the server you are logging into, since intercepting a hash is no more secure than intercepting a plain-text password.

The point is really to secure the user's password, which is usually far more valuable than access to an individual site, since most users will reuse their password across multiple sites (they shouldn't, but the reality is that they do, so it should not be waved off).

SSL is great for protecting against MITM attacks, but if a login application is compromised on the server, SSL won't protect your users. If someone has maliciously gained access to a web server, they will likely be able to intercept passwords in plain text, because in most cases passwords are only hashed by a script on the server. The attacker can then try those passwords on other (usually more valuable) sites using similar usernames.

Security is defense in depth, and client-side hashing simply adds another layer. Remember that while protecting access to your own site is important, protecting the secrecy of your users' passwords is far more important because of password reuse on other sites.
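As a toy sketch of that extra layer (Python standing in for both client and server; the names and parameters are illustrative, not a vetted protocol): the client submits a site-bound hash instead of the raw password, and the server still stores a salted, slow hash of whatever it receives.

    import os, hmac, hashlib

    def client_credential(master_password: str, domain: str) -> bytes:
        # Sent in place of the raw password, so a compromised server
        # never sees the user's reusable master password
        return hashlib.sha256((domain + ":" + master_password).encode()).digest()

    def server_store(received: bytes):
        # The server must still hash what it receives, since the
        # client-side hash is now effectively the password
        salt = os.urandom(16)
        return salt, hashlib.pbkdf2_hmac("sha256", received, salt, 200_000)

    def server_verify(received: bytes, salt: bytes, stored: bytes) -> bool:
        return hmac.compare_digest(
            hashlib.pbkdf2_hmac("sha256", received, salt, 200_000), stored)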

The solution is simpler than that: client certificates. I create a client certificate on my machine. When I register with a website, we do a handshake using my client certificate and the server's certificate.

No passwords are exchanged and even if someone hacks the database all they'll have is the public key of my client certificate (which should be salted and encrypted by the server for an added level of security).

The client certificate can be stored on a smart card (and uploaded to a secure online vault using a master password).

The beauty of it all is that it removes the concept of phishing entirely: you're never entering a password into a website, you're just handshaking with it. All they get is your public key, which is useless without the private key. The only weakness is finding a collision during a handshake, and that would only work once on a single website.

Microsoft tried to provide something like this in Windows with CardSpace and later submitted it as an open standard. OAuth is somewhat similar, but it relies on an intermediary "issuing party". InfoCards, on the other hand, could be self-issued. That's the real solution to the password problem: removing passwords altogether.

OAuth is a step in the right direction though.

It is definitely possible, and you actually do not need to wait for websites to support it.

Have a look at SuperGenPass. It is a bookmarklet.

It simply recognizes password fields, concatenates what you type with the website's domain, hashes it, mangles the result somewhat so as to get only "admitted" characters in the password, and only then sends your hashed password on the wire.

By using the site domain in the process, you thus get a unique password per site, even if you always reuse the same password.
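A rough sketch of that derivation (this is not the actual SuperGenPass algorithm, which iterates MD5 and enforces extra character rules; the mangling choices here are illustrative):

    import base64, hashlib

    def site_password(master: str, domain: str, length: int = 10) -> str:
        # Concatenate what the user types with the site's domain and hash it
        digest = hashlib.md5((master + ":" + domain).encode()).digest()
        # Base64-encode, then replace characters some sites reject
        b64 = base64.b64encode(digest).decode()
        return b64.replace("+", "9").replace("/", "8").replace("=", "A")[:length]

    # The same master password yields a different password on every site:
    print(site_password("correct horse", "example.com"))
    print(site_password("correct horse", "example.org"))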

It is not extremely secure (base64-MD5), but you could perfectly well distribute a SHA-2-based implementation if you wished.

The only downside is if the domain changes, in which case you'll need to ask the website to reset your password, because you'll be unable to recover it by yourself. That does not happen often, though, so I consider it an acceptable trade-off.

I like X4u's answer, but in my opinion something like it should be integrated into the browser or the HTML specification, as at the moment it's only half the answer.

Here's the problem I have as a user: I have no idea whether my password is going to be hashed at the other end when stored in the database. The line between me and the server may well be encrypted, but I have no idea what happens to my password once it reaches the destination; it may be stored as plain text. The database admin may end up selling the database, and before you know it the whole world knows your password.

Most users reuse passwords: non-technical people because they don't know any better, and technical people because once you get to the 15th password or so, most people don't stand a chance of remembering them all unless they write them down (which we all know is also a bad idea).

If Chrome or IE or whatever I am using could tell me that a password box is going to be client-side hashed using a server-generated salt, effectively sandboxing the password itself, then I would know that as a user I could reuse a password with less risk. I'd still want the encrypted channels as well, as I don't want any eavesdropping going on during transmission.

The user needs to know that their password is not even available to be sent to the server; only the hash is. At present, even using X4u's solution, they have no way of knowing this is the case, because you can't tell whether that technology is in use.

I think it's a good method to use when building something like a framework, CMS, or forum software, where you don't control the servers it might be installed on. That is, YES, you should always recommend SSL for logins and logged-in activity, but some sites using your framework/CMS won't have it, so they can still benefit from this.

As others have pointed out, the benefit here is NOT that a MITM attack couldn't allow someone else to log into this particular site as you, but rather that the attacker wouldn't then be able to use the same username/password combo to log into the possibly dozens of other sites where you have accounts.

Such a scheme should salt with either a random salt or some combination of site-specific and username-specific salts, so that someone who obtains the password can neither use it for the same username on other sites (even sites using an identical hashing scheme) nor against other users of the same site who might have the same password.

Others have suggested that users should create unique passwords for every single site they use, or use password managers. While this is sound advice in theory, we all know it is folly to rely on in the real world. The percentage of users who do either of these things is small, and I doubt that will change any time soon.

So a JavaScript password hasher is about the least that a framework/CMS developer can do to limit the damage of someone intercepting passwords in transit (which is easy over Wi-Fi networks these days) when both the site owner and the end users are being negligent about security (which they likely are).


What Is the Best Load Balancer Monitoring Tool on the Market?

Equally as important as addressing the "What is load balancing?" question is knowing which of the many application load balancing software options is the best fit for your IT department. I've included a brief list of a few of my favorites to help narrow down your search. While each of these application load balancers is unique, they all have one thing in common: helping you remain as efficient and effective as possible.

SolarWinds Network Performance Monitor

Network Performance Monitor is a multi-vendor network monitoring platform designed to provide system administrators with in-depth, real-time network insights. Wondering how to test server load balancing? This comprehensive network monitoring software displays all your devices, applications, networks, and vendors to help you quickly get to the root of any problems—including those pertaining to application load balancers.

Poorly functioning application load balancers lead to intermittent service outages or severe slowdowns. Network Performance Monitor helps keep these issues at bay by equipping you with the tools needed to quickly isolate the source of any load balancing issues. Once you know where the problem is originating, you can jump into action to resolve the issue. Take advantage of the 30-day fully functional NPM free trial.

SolarWinds Server & Application Monitor

Server & Application Monitor is another tool from the team at SolarWinds. This platform is designed to monitor your applications and their supporting infrastructure, including those running on-prem and in the cloud.

Through the tool's AppInsight™ application, you can closely monitor your Internet Information Services (IIS) to evaluate current connections, POST requests, and GET requests. If the value proves too high for any one IIS server, additional application load balancing techniques can be put into place to help reduce the burden on the server in question. You can try it out for yourself with a free 30-day trial.

Paessler PRTG Network Monitor

PRTG Network Monitor from Paessler is, as the name suggests, a tool designed to monitor your IT infrastructure and keep you abreast of problems before they bring productivity to a standstill. From an application load balancer perspective, this tool can be used to collect and view detailed data on all your global load balancers, firewalls, web servers, and traffic, to help you maintain traffic stability, ensure uptime, and monitor bandwidth consumption. It’s a comprehensive platform but can be costly for larger organizations to implement.


2 Answers

Ethereum uses Keccak-256. It should be noted that Keccak-256 does not follow the FIPS-202-based standard (a.k.a. SHA-3), which was finalized in August 2015.

According to this, NIST changed the padding to SHA3-256(M) = KECCAK[512](M || 01, 256). This differs from the padding proposed by the Keccak team in The Keccak SHA-3 submission version 3 (the final, winning version); the difference is the additional '01' bits appended to the message. People now call the "submitted version 3" SHA-3 hashing "Keccak" and the finalized NIST SHA-3 standard "SHA-3".

Using this online generator and the Solidity Online Compiler, I tested the difference between Keccak-256 and SHA3-256 by hashing the word testing with Ethereum and the two SHA-3 hashing algorithms:
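(The original comparison output is not preserved here; the following small Python sketch reproduces the experiment locally. hashlib has provided SHA3-256 since Python 3.6, and the legacy Keccak-256 below comes from pycryptodome, installed with pip install pycryptodome.)

    import hashlib
    from Crypto.Hash import keccak

    msg = b"testing"

    # Finalized NIST SHA-3: FIPS-202 padding with the extra '01' bits
    print("SHA3-256:  ", hashlib.sha3_256(msg).hexdigest())

    # Original Keccak padding, as used by Ethereum
    k = keccak.new(digest_bits=256)
    k.update(msg)
    print("Keccak-256:", k.hexdigest())

    # The two digests differ, showing that Ethereum's "sha3" is Keccak-256,
    # not the NIST-standardized SHA3-256.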

