How User Experience Designers Help You Achieve a Great User Experience

User experience design spans a wide range of activities aimed at creating an experience that is appealing, fast, and easy to navigate; in short, an interactive interface that helps customers complete their tasks. Salesforce User Experience Designers (UX designers) are among the most sought-after professionals in the digital industry today, and the job title is often used interchangeably with User Experience Manager, User Interface (UI) Designer, and Product Designer.

This blog explains how user experience design can be used to boost productivity and to better connect with market segments.

What Is User Experience Design?

User experience design (UX) is the process of creating a product that feels like humans designed it. UX’s goal is to make digital products as intuitive, efficient, and enjoyable as possible.

User experience design focuses on the end user and how they interact with a product. It’s not just about aesthetics; it’s about ensuring your website or app works well on all types of devices and browsers with different screen sizes and resolutions.

It involves researching the target audience, understanding their needs and goals, and designing a product that meets those needs. Those preparing for the Salesforce User Experience Designer certification can use the study guides available on multiple platforms to learn more about the subject.

How Does User Experience Design Impact Productivity?

User experience design is not just about aesthetics. The process of creating a good user experience has many other benefits.

It helps users save time and effort when using a product. It also saves money on maintenance costs, including human resources and equipment, and a good user experience reduces the need for additional training or support from technical experts.

User experience design can make products easier to use, increasing productivity by reducing user error. For example, when you open an email in your browser, it should read like a web page rather than an application window that displays only a limited amount of text or information at a time. That way, you can view the message without having to click through multiple pages of information before finding what you’re looking for.

The Importance of User Experience Design

User experience is critical to the success of any business application. A well-designed application can help businesses increase productivity, while a poorly designed one only leads to frustration and decreased productivity. Hence, businesses invest in UX design to ensure their applications deliver a great user experience.

User Research and Interaction Design

Conducting user research is essential to understanding how users interact with business applications. This research helps businesses identify areas where they can improve an application to deliver a better user experience. Once the research is complete, businesses can use interaction design to make the necessary changes.

Designing the Information Architecture

The information architecture of an application is critical to its success. A well-designed information architecture helps users find the information they need quickly and easily; a poorly designed one only leads to confusion and frustration.

Creating Prototypes and Conducting User Testing

Once the information architecture has been designed, businesses can create prototypes of their applications. These prototypes can then be user tested to ensure that they are easy to use and provide a great user experience.

Implementing UX design can help businesses boost productivity, save time and money, and better connect with their target market segments. By investing in UX design, businesses can create applications that deliver a great user experience and drive tangible results.

The Elements of Good User Experience Design

Good user experience design must consider the needs of users, the application’s business goals, and the technical constraints of the platform. UX designers must balance these three factors to create an effective and intuitive application.

The user must be able to accomplish their goals with the application. This includes finding the information or features they need, understanding how to use the application, and feeling comfortable using it.

The business goals of the application must be achievable. This means that the application must be able to meet the specific needs of the business. For example, if the business goal is to increase sales, the application must be designed to make it easy for users to purchase products.

The technical constraints of the platform must be considered. This includes ensuring that the application is compatible with the devices users will be using and with any other software it must integrate with.

Creating a great user experience requires a deep understanding of all three elements. UX designers must have a strong understanding of human behavior and the ability to think creatively and solve problems.

The Benefits of Good User Experience Design

The benefits of good user experience design can be summarized as follows:

  1. Helps businesses achieve their goals.
  2. Makes applications more intuitive and effective.
  3. Increases productivity.
  4. Reduces frustration.
  5. Can help businesses save money.

Good user experience design is essential to the success of any business application. By investing in UX design, businesses can create more effective, intuitive, and engaging applications, leading to increased productivity.

The Most Common Mistakes in UX Design

The most common mistakes in UX design are:

  1. Not understanding the user.
  2. Not considering the business goals.
  3. Not taking into account the technical constraints.
  4. Not thinking creatively.
  5. Not being able to solve problems.

These are the most common mistakes because they reflect the most important things to consider when designing applications. If businesses want to create applications that deliver great user experiences, they must take care to avoid them.

Conclusion

For any user experience designer, UX design is a powerful tool to increase productivity and better connect with target market segments. By improving the usability of business applications, creating a consistent and intuitive user interface, and allowing users to customize their work environments, businesses can boost productivity and create a better user experience.

CGI’s Cyber Escape Experience visits Lincoln

Leading IT solutions provider CGI brought its Cyber Escape experience to Lincoln as part of a UK-wide tour. The escape room-style experience allowed staff, students, and local organisations to learn about online security risks in a fun and interactive way.

CGI’s Cyber Escape was hosted at the University of Lincoln, where staff and students were invited to visit the experience and put their knowledge to the test in a simulated “real-life” setting. Teams from the Lincolnshire Community Health Services NHS Trust, Lincoln City Foundation and Ministry of Defence also joined throughout the week to explore. Attendees teamed up to beat the hacker and learn about cyber security risks and how to avoid them.

The experience is built within a shipping container and enables small groups to participate in a short “escape”. Teams must work together to uncover clues, solve puzzles, and accomplish cyber-related tasks to escape successfully in the time allowed. The experience allows users to test their cyber skills and see if they have what it takes to stay safe in our digital world. Participants learn critical skills in a simulated real-world setting through interactive activities, much like other escape rooms. Alongside this hands-on learning experience, CGI offered University of Lincoln students the opportunity to join a talk focused on Agile and hybrid ways of working in Space, Defence and Intelligence, giving them a chance to learn from professionals and openly discuss future graduate employment routes.

Donna Kelly for CGI in the UK said: “Cyber security is vitally important for individuals, companies, educational establishments, and our communities alike. We all spend much of our personal and professional lives online, and it is imperative everyone knows how to practice safe cyber techniques and can continue honing their existing skills. We were pleased to share the experience with the University of Lincoln, Lincolnshire Community Health Services NHS Trust, Lincoln City Foundation, and other organisations who attended. We look forward to continuing our engagement in the region after our involvement in the recent half-marathon and work with the Lincoln City Foundation.”

Yvonne James, Senior Lecturer in Computer Science (Cyber) & Programme Leader Cyber Security and Computer Networking from the University of Lincoln, said: “We were excited to bring this experience and the opportunity to discuss career routes to our students. They were able to learn the real-world implications of cybersecurity, the vulnerabilities people experience in a day-to-day situation, and receive exposure to career opportunities after university.”

The Cyber Escape experience was developed by CGI’s UK Cyber Security practice to train, educate and engage with businesses to help them better understand cyber security risks, complement any existing security awareness training, and increase knowledge of how to reduce the impact of a cyber-attack in a unique way. In conjunction with the organisation’s STEM@CGI team, it has tailored the experience for young people too, as they face evolving cyber threats and challenges online.

CGI’s Cyber Escape will be transported around the country to different locations for students and staff to participate in their cyber adventure, or arrangements may be made to visit the experience at a CGI location.

To find out more about CGI’s Cyber Escape experience and how you can get involved, visit: https://www.cgi.com/uk/cyberescape


Enable Wayland Support for Firefox/Chrome in Ubuntu 22.04 for Better Experience

Running Ubuntu 22.04 with the default Wayland session? You can switch your web browser’s backend to get an even faster and smoother experience.

Firefox, Google Chrome, and Chromium-based web browsers have native Wayland support, but they still use X11 as the backend on the Ubuntu desktop.

Since Ubuntu 22.04 logs into a Wayland session by default, users can change the web browser’s backend to get a faster and smoother browsing experience. I didn’t run any benchmarks, but after switching to Wayland, my browser now has:

  • obviously better touchpad scrolling
  • 2-finger spread/pinch gestures to zoom in/out

Enable Wayland for Chrome/Chromium

For Google Chrome, Chromium, and browsers based on them, e.g., Edge and Vivaldi, just type chrome://flags/ in the address bar and hit Enter.

When the page opens, search for “Preferred Ozone platform” and use the dropdown menu to set its value to “Wayland”. Finally, click the “Relaunch” button to apply the change by restarting the web browser.

[Screenshot: setting Preferred Ozone platform to Wayland in chrome://flags]
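If you would rather test Wayland for a single run before flipping the flag, recent Chrome and Chromium builds also accept the Ozone platform as a launch option (a quick sketch; flag support can vary between versions):

google-chrome --ozone-platform=wayland

Launching this way only affects that one session, so it is a handy way to check for regressions before making the change permanent.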

Native Wayland for Firefox

For the Firefox web browser, you need to edit the “/etc/environment” config file.

First, press Ctrl+Alt+T on the keyboard to open a terminal. When it opens, run the following command to open the file in the gedit text editor:

sudo gedit /etc/environment

Replace gedit with your favorite text editor, such as gnome-text-editor for the upcoming Ubuntu 22.10 and Fedora.

When the file opens in the text editor, just add a new line:

MOZ_ENABLE_WAYLAND=1

You can also add more rules to this config file:

  • MUTTER_DEBUG_ENABLE_ATOMIC_KMS=0 – to fix a slightly laggy, slightly sloppy mouse response issue.
  • CLUTTER_PAINT=disable-dynamic-max-render-time – to get a smoother frame rate.
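Taken together, the additions to /etc/environment from this section would look like this:

MOZ_ENABLE_WAYLAND=1
MUTTER_DEBUG_ENABLE_ATOMIC_KMS=0
CLUTTER_PAINT=disable-dynamic-max-render-time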

After saving the changes in the config file, restart your computer for them to take effect.
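To confirm that you are actually running a Wayland session after the reboot, you can check the session type from a terminal; it should print “wayland” rather than “x11”:

echo $XDG_SESSION_TYPE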

via: Ubuntu Discourse

Secure Coding Practice – A Developer’s Learning Experience of Developing Secure Software Course

The original article appeared on the OpenSSF blog. The author, Harimohan Rajamohanan, is a Solution Architect and Full Stack Developer with Wipro Limited. Learn more about the Linux Foundation’s Developing Secure Software (LFD121) course

All software is under continuous attack today, so software architects and developers should focus on practical steps to improve information security. There are plenty of materials available online that discuss various aspects of secure development practices, but they are scattered across articles and books. Recently, I came across a course developed by the Open Source Security Foundation (OpenSSF), which is a part of the Linux Foundation, that is geared towards software developers, DevOps professionals, web application developers, and others interested in learning the best practices of secure software development. My experience taking the DEVELOPING SECURE SOFTWARE (LFD121) course was positive, and I immediately started applying what I learned in my work as a software architect and developer.

“A useful trick for creating secure systems is to think like an attacker before you write the code or make a change to the code” – DEVELOPING SECURE SOFTWARE (LFD121)

My earlier understanding of software security was primarily focused on the authentication and authorization of users. In that context, the secure coding practices I was following were limited to:

  • No unauthorized reads
  • No unauthorized modifications
  • The ability to prove someone did something
  • Auditing and logging

It is not broad enough to assume software is secure simply because strong authentication and authorization mechanisms are present. Almost all application development today depends on open source software, so it is important that developers verify the security of the open source chain of contributors and its dependencies. Recent vulnerability disclosures and supply chain attacks were an eye-opener for me about the potential for vulnerabilities in open source software. The natural focus of the majority of developers is to get the business logic working and deliver the code without functional bugs.

The course gave me a comprehensive outlook on the secure development practices one should follow to defend from the kind of attacks that happen in modern day software.

What does risk management really mean?

The course has detailed, practical advice on considering security as part of the requirements of a system. Having been part of various global system integrators for over a decade, I was tasked with developing application software for my customers. The functional requirements were typically written down in such projects but covered only a few aspects of security, in terms of user authentication and authorization. Documenting the security requirements in detail helps developers and future maintainers of the software understand what the system is trying to accomplish for security.

Key takeaways on risk assessment:

  • Analyze security basics, including risk management, the “CIA” triad, and requirements
  • Apply secure design principles such as least privilege, complete mediation, and input validation
  • Evaluate the supply chain: reuse software with security in mind, including selecting, downloading, installing, and updating it
  • Document the high-level security requirements in one place

Secure design principles while designing a software solution

Design principles are guides based on experience and practice. Software will generally be secure if you apply the secure design principles. This course covers a broad spectrum of design principles in terms of the components you trust and the components you do not. The key principles I learned from the course that guide my present-day software designs are:

  • The user and the program should operate using the least privilege. This limits the damage from error or attack.
  • Every data access or manipulation attempt should be verified and authorized using a mechanism that cannot be bypassed.
  • Access to systems should be based on more than one condition. How do you prove the identity of the authenticated user is who they claim to be? Software should support two-factor authentication.
  • The user interface should be designed for ease of use, so that users routinely and automatically use the protection mechanisms correctly.
  • Understand what kind of attackers you expect to counter.

A few examples of how I applied the secure design principles in my solution designs:

  • The solutions I build often use a database. I use the SQL GRANT command to limit the privileges the program gets; in particular, the DELETE privilege is not given to any program. Instead, I implement a soft delete mechanism in the program that sets the column “active = false” in the table for delete use cases (see the first sketch after this list).
  • My recent software designs are based on a microservice architecture with a clear separation between the GUI and the backend services. Each part of the overall solution is authenticated separately. This may minimize the attack surface.
  • Client-side input validation is limited to countering accidental mistakes; the actual input validation happens on the server side. The API endpoints validate all inputs thoroughly before processing them. For instance, a PUT API not only validates the resource modification inputs but also makes sure the resource is present in the database before proceeding with the update.
  • Updates are allowed only if the user consuming the API is authorized to make them.
  • Databases are not directly accessible to client applications.
  • All secrets, such as cryptographic keys and passwords, are maintained outside the program in a secure vault, mainly to keep secrets in source code from going into version control systems.
  • I have started to look for the OpenSSF Best Practices Badge when selecting open source software and libraries for my programs, and I check the security posture of open source software via its OpenSSF Scorecard score (see the second sketch after this list).
  • Another practice I follow while using open source software is to check whether the software is maintained: are there recent releases or announcements from the community?
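As a minimal sketch of the least-privilege and soft-delete approach described above (the table and role names are hypothetical, and the SQL is issued here through the psql client):

# Grant the application role read and write access, but never DELETE.
psql -c "GRANT SELECT, INSERT, UPDATE ON orders TO app_user;"
# A "delete" from the application is a soft delete: it only flips a flag.
psql -c "UPDATE orders SET active = false WHERE id = 42;"

Even if the application is compromised, the database itself refuses destructive deletes for that role.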
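And for checking the security posture of a dependency, the OpenSSF Scorecard CLI can be pointed at its repository (a sketch; the tool expects a GitHub access token in the environment):

# Example: score the Scorecard project itself.
scorecard --repo=github.com/ossf/scorecard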

Secure coding practices

In my opinion, this course covers almost all aspects of secure coding practices that a developer should focus on. The key focus areas include:

  • Input validation
    • How to validate numbers
    • Key issues with text, including Unicode and locales
    • Using regular expressions to validate text input
    • The importance of minimizing attack surfaces
    • Secure defaults and secure startup

For example, apply API input validation on IDs to make sure that records belonging to those IDs exist in the database; this reduces the attack surface. Likewise, make sure first that the object in an object-modify request exists in the database.

  • Process data securely
    • Treat untrusted data as dangerous
    • Avoid default and hardcoded credentials
    • Understand memory safety problems such as out-of-bounds reads or writes, double-free, and use-after-free
    • Avoid undefined behavior
  • Call out to other programs
    • Securely call other programs
    • Counter injection attacks such as SQL injection and OS command injection
    • Securely handle file names and file paths
  • Send output
    • Securely send output
    • Counter cross-site scripting (XSS) attacks
    • Use HTTP hardening headers, including Content Security Policy (CSP)
    • Prevent common output-related vulnerabilities in web applications
    • Securely format strings and templates

Conclusion

“Security is a process – a journey – and not a simple endpoint” – DEVELOPING SECURE SOFTWARE (LFD121)

This course gives practical guidance for developing secure software: gathering security requirements, applying secure design principles, countering common implementation mistakes, using tools to detect problems before you ship the code, and promptly handling vulnerability reports. I strongly recommend this course and the certification to all developers out there.

About the author

Harimohan Rajamohanan is a Solution Architect and Full Stack Developer in the Open Source Program Office, Lab45, Wipro Limited. He is an open source software enthusiast and has worked in areas such as application modernization, digital transformation, and cloud native computing. His major focus areas are software supply chain security and observability.


LFX’22 Mentorship Experience with Open Horizon

The following post originally appeared on Medium. The author, Ruchi Pakhle, participated in our LFX Mentorship program this past spring.

echo “amazing experience”

Hey everyone!
I am Ruchi Pakhle, currently pursuing my Bachelor’s in Computer Engineering at MGM’s College of Engineering & Technology. I am a passionate developer and an open-source enthusiast, and I recently graduated from the LFX Mentorship Program. In this blog post, I will share my experience of contributing to Open Horizon, a platform for deploying container-based workloads and related machine learning models to compute nodes/clusters on the edge.

Background

I have been an active contributor to open-source projects via programs like GirlScript Summer of Code, Script Winter of Code, and so on; through these programs I contributed to different beginner-level open-source projects. After doing this for almost a year, I contributed to different organizations on different projects, including documentation and code. Then, one very random morning, applications for LFX opened up, and I saw various posts about it on LinkedIn. Among those posts was one from my very dear friend Unnati Chhabra, who had just graduated from the program, so I went ahead, checked which organization fit my skill set, and decided to give it a shot.

Why did I apply to Open Horizon?

I was very interested in DevOps and Cloud Native technologies and wanted to get started with them, but I had been procrastinating a lot and did not know how to pave my path ahead. I was constantly looking for opportunities I could get my hands on, and as Open Horizon works exactly on DevOps and Cloud Native technologies, I applied to their project straight away; they had two slots open for the spring cohort. I joined their Element channel and started becoming active by contributing to the project, engaging with the community, and reading more about the architecture, trying to understand it well by referring to their YouTube videos. You can contribute to Open Horizon here.

Application process

The Linux Foundation opens LFX mentorship applications thrice a year: one cohort in spring, one in summer, and one in winter, each spanning 3 months. I applied to the spring cohort, for which applications opened around February 2022, and I submitted my application on 4th February 2022 for the Open Horizon project. I remember three documents being mandatory for the application:

1. Updated Resume/CV

2. Cover Letter

(this is very, very important for your selection, so cover everything in your cover letter and maybe add links to your projects, achievements, or anything else you think can add great value)

The cover letter should primarily cover these points:

  • How did you find out about our mentorship program?
  • Why are you interested in this program?
  • What experience and knowledge/skills do you have that are applicable to this program?
  • What do you hope to get out of this mentorship experience?

3. A permission document from your university stating it has no objection to your participation over the entire span of the mentorship (this depends on the org and may not be asked for at all)

Selection Mail

The LFX acceptance mail was a major achievement for me because, at that period of time, I was constantly getting rejections and had absolutely no idea how things were going to work out for me. I was constantly doubting myself, so this mail not only boosted my confidence but also gave me a ray of hope that I could achieve things by consistently working hard towards them. A major thanks to my mentors, Joe Pearson and Troy Fine, for believing in me and giving me this opportunity.

My Mentorship Journey

Starting from the day I applied to LFX until getting selected as an LFX Mentee and working successfully for over three and a half months, it has felt surreal. I have contributed to open-source projects and organizations before, but being a part of LFX gave me a huge learning curve and a sense of credibility and ownership that I wouldn’t have gotten anywhere else.


I still remember setting up the mgmt-hub all-in-one script locally. I thought it would be a cakewalk; it was not. I literally tried to run the script every single day, but it would end up giving some error. I used to google the errors and apply the fixes, but it would still fail. One thing I did consistently, though, was share my progress with my mentor, Troy: no matter how often the script failed, I communicated it to him and sent him logs, and he would suggest probable solutions, yet the script kept failing. I then messaged the open-horizon-examples group, and Joe helped with my doubts; a huge thanks to him and Troy for patiently helping me figure things out. After over a month, on April 1st, the script finally executed successfully, and I started working on the issues assigned by Troy.

These three months taught me to be consistent no matter the circumstances and to work patiently, which I wouldn’t have learned in college. This experience will no doubt make me a better developer and engineer, along with the best practices I picked up. A timeline of my journey has been shared here.

Check out my contributions here
Check out the open-horizon-services repo

Concluding the program

The LFX Mentorship Program was a great experience, and it gave me a learning curve I wouldn’t have gotten any other way. The program not only encourages developers to kick-start their open-source journey but also provides some great perks, like networking and learning from the best minds. I would like to thank my mentors Joe Pearson, Troy Fine, and Glen Darling, because without their support and patience this wouldn’t have been possible. I will be forever grateful for this opportunity.

Special thanks to my mentor Troy for always being patient with me. His kind words below will remain with me long after the program has ended:

“The LF Edge Mentorship program is always a great learning experience, and this year was no exception. Because of Ruchi’s work we now have more services following our best practice policies in the open-horizon-services GitHub repository. Despite the time difference she was always flexible when it came to our sync-ups and was never afraid to ask questions or for clarification if something wasn’t clear. I hope Ruchi will continue to provide the meaningful contributions to the Open Horizon project I have seen her demonstrate throughout this mentorship program.”

And yes, how can I forget the awesome swag? Special thanks and gratitude to my mentor Joe Pearson for sending me such cool swag and this super cool note.

If you have any queries, connect with me on LinkedIn or Twitter, and I would be happy to help you out.


11 Pro Vim Tips to Get Better Editing Experience


The Vim editor is like an ocean – wonderful and joyful to be in, but there will always be things you don’t know.

While you cannot discover the ocean alone, you can always learn from others’ experiences.

I am sharing a few tips in this article that will help you use Vim like a pro.

I use them regularly and I have seen expert Vim users sharing them in various communities.

You should add them to your vimrc file, wherever applicable. You’ll have a better and smoother experience using the ever-versatile Vim editor. Trust me on this.

1: Always use the built-in help

I cannot stress this enough. The biggest and least used tip is “RTFM” (Read the f**king manual).

Obviously, there is the Internet, humanity’s biggest collective resource for untapped knowledge, but what happens when Stack Overflow goes down?

Getting yourself habituated to Vim’s built-in help is the biggest favour you can do for yourself.

The syntax for looking at Vim’s internal help is as follows:

  • Prefix : (e.g. :help :w) – help for the ‘:w’ Ex command
  • No prefix (e.g. :help j) – help for the ‘j’ key in Normal mode
  • Prefix v_ (e.g. :help v_J) – help for the ‘J’ key in Visual mode
  • Prefix i_ (e.g. :help i_<Esc>) – help for the ‘Esc’ key in Insert mode
  • Prefix / (e.g. :help /\n) – help for the search pattern ‘\n’

2: Open as normal user, save as root user

When editing system files, it is easy to forget to add sudo before opening the file in Vim. The file then opens in ‘readonly’ mode, meaning you cannot write anything to it.

But you might have made some significant changes, and there might be no way of remembering every single edit you made. Hence, exiting with unsaved work is not an option.

[Screenshot: Vim opens the system file in read-only mode]

In those scenarios, type the following command in Vim:

:w !sudo tee %

Once you type this command, you will be asked for your sudo password. Enter it, and your changes will be saved.

💡
You should use the sudoedit command instead of sudo vim for editing files that require superuser privileges.
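For example, to edit the SSH daemon configuration that appears later in this article:

sudoedit /etc/ssh/sshd_config

sudoedit opens a temporary copy of the file in your normal editor under your own user and writes it back with root privileges only when you save.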

Let us break down the :w !sudo tee % command and understand what is happening here…

  • :w – This is the write command. Since no argument is given, Vim will write the whole file to standard output.
  • !sudo – Run the ‘sudo’ command as a shell command, not as a Vim command
  • tee – The ‘tee’ command is used to read from standard input and write it either to standard output or to a file
  • % – Vim substitutes this by the name of the current file that you are editing.

The :w command writes the whole file to STDOUT (standard output). Then, we use the sudo command (since what we are editing is a system file, after all) to obtain temporary privileges.

The percent sign (%) represents our filename and the tee command takes Vim’s output from STDOUT and writes it to the % file.

This essentially works out to <Vim's STDOUT> | sudo tee /etc/ssh/sshd_config. A bit complex initially, but so is Vim 😉

3: Convert all spaces to tabs and vice-a-versa

We all have a preference for using either tabs or spaces over the other.

But what if you are editing an indented text file that contradicts your preference?

3.1: Convert all spaces to tabs

When the current file is indented with spaces and you wish to convert them to tabs, there are two Vim commands that you need to run.

These two commands are as follows:

:set noexpandtab
:retab!

Doing so will convert all spaces to their equivalent in tabs. If the document uses two spaces as the indentation width, they will be converted to one tab. If four spaces are used as a single indentation width, those four spaces will be replaced with one tab character.

3.2: Convert all tabs to spaces

If the file you are editing is indented with tabs and you want to convert the tabs to spaces, there are four Vim commands you must run.

:set expandtab
:set tabstop=4
:set shiftwidth=4
:retab

The first command (expandtab) tells Vim to expand tabs into spaces. The second command (tabstop) controls how many spaces count as one ‘indentation block’.

In our case, we are defining “1 tab = 4 spaces”. The shiftwidth option controls the indentation used by the >> operator; it, too, is set to 4 spaces.

Finally, the retab command converts all tabs (that are used for indentation) to spaces.
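If you convert files like this often, the same commands can be run non-interactively from the shell using Vim’s -c flag (a convenience sketch; the file name here is hypothetical):

vim -c 'set expandtab tabstop=4 shiftwidth=4' -c 'retab' -c 'wq' script.py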

4: Indent all lines

Wrongly indented lines can create havoc for Python and YAML programs.

To indent all lines, press the gg key to reach the top. Then press the = key to denote ‘indent’ and finally press the G key to denote ‘last line’.

Repeat with me; it is gg=G key combination to indent all lines.

This will automatically indent (to the best of Vim’s ability) all lines from the first line to the last line.

Below is a demonstration where I indent Rust code using the gg=G key combination.

[Demonstration: indenting Rust code with gg=G]

As you can see (from this limited preview), all the lines are correctly indented.

The icing on the cake is that lines do not have to be wrongly indented to use Vim’s indentation.

5: Preserve indentation when you paste code

Admit it; we have all copy-pasted code from the internet at least once. But what do you do when the indentation gets messed up when you paste it?

To avoid that, add the following line to your .vimrc file:

set pastetoggle=<F2>

With this change to your vimrc file, press the F2 key before you paste code. Doing so will ensure that your code gets pasted with the correct indentation.

6: Start writing with the correct indent depth

This is a handy trick that I learned only recently. Suppose you are on the first column of a line, but what you write needs to be indented.

How do you do that in a smart way, without pressing tabs or spaces?

The answer is to use the S key in Normal mode.

When you are on the first column of a line, enter Normal mode by pressing the Esc key. Then press the S (uppercase) key. This will move your cursor to the appropriate indent depth and automatically enter Insert mode so that you can start typing.

[Demonstration: pressing S moves the cursor to the correct indent depth and switches to Insert mode]

As you can see in this demonstration, my cursor was on the first column; by pressing the S key, the cursor moved to the correct indent depth and Vim switched from Normal mode to Insert mode.

Pretty neat!

7: Show diff before saving the file

We have all been there. “I modified this file, but I don’t know what I changed, and now I am afraid the change will cause unexpected issues down the road.”

The remedy to this problem is to view the difference between the buffer and the file.

To do so, execute the following command in Vim itself:

:w !diff % -

Let’s break this down so you understand what is happening…

  • :w is the Vim command to save/write. In this particular scenario, where no file name is specified in the command, the buffer’s contents are written to the standard input of the command that follows.
  • :!<command> is the syntax for executing a shell command. In our case, we are running the diff command in our shell.
  • % represents the name of the current file that is unmodified. Try this with :!echo %.
  • - is the STDIN file for the diff command.

So, this command first writes all the [unsaved] content to the STDIN file. Then the diff command reads the current file (%) and compares it against the STDIN (-) file.

This command roughly equates to this shell command -> diff <original-file> <Vim's STDOUT>.

8: Show spelling mistakes

If you have been using only Vim ever since the beginning, good for you! But some people are also introduced to word processing software like Microsoft Word.

It has a feature (or a curse, for people with non-English names), where the spell-checker of MS Word places a red squiggly line under a misspelled word.

That feature might appear to be “missing” from Vim. Well, not exactly.

Vim has a spell checker built into it. You can enable it using the following command:

:set spell

Upon doing this, you might see misspelled words get highlighted. How they are highlighted depends on your Vim color scheme; I get a white underline under misspelled words here.

[Screenshot: misspelled words underlined after :set spell]

Your mileage may vary with the method of highlighting a word.

To make this the default Vim behaviour, you can add the following line to your .vimrc file:

set spell

9: Show line numbers

Like many Vim users, you might have tried to enable line numbers in Vim.

There are two methods of indicating line numbers. One is absolute line numbering: you get the absolute number for each line, just like in any other code editor or IDE.

The second is relative line numbering: the current line gets the number ‘0’, and every other line gets a number relative to the line the cursor is on.

If you like both but have to make the tough choice of choosing one over the other, you are not alone. But you don’t actually have to choose one over the other; you can have both!

You can enable “Hybrid line numbering” in Vim by adding the following line to your .vimrc:

set number relativenumber

This will show the absolute line number on the line with your cursor and relative line numbers for other lines.

Below is a screenshot demonstrating how hybrid line numbering works:

[Screenshot: hybrid line numbering in Vim]

Currently, my cursor is on the 44th line, so that line shows its absolute line number. The lines above and below my cursor show numbers relative to the line with the cursor.

10: Open Vim with the cursor on a particular line

There might have been times when you wanted to open Vim with the cursor set to a particular line instead of the first line.

This can be done by making use of the +linenum option. Below is the syntax to do so:

vim +linenum FILE

Replace the word linenum with an actual number.
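For example, to open the SSH daemon config with the cursor on line 20:

vim +20 /etc/ssh/sshd_config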

[Screenshot: opening /etc/ssh/sshd_config with the cursor on a specific line]

Here you can see that I open the /etc/ssh/sshd_config file with the cursor on line 20 and then on line 50. That was done using the +linenum option.

11: Use readable color schemes

When it comes to using color schemes, people often choose the ones they find most attractive or aesthetically pleasing. But when you use Vim as a code editor cum IDE, it is nice to give up some eye candy in favour of colorschemes with better visual guides.

A good colorscheme merely looks good, but an excellent colorscheme helps you easily identify keywords, variables, and other identifiers with the help of colors.

A few of my personal favorite colorschemes are as follows:

If you are not sure about how to use colorschemes in Vim, we have it covered on Linux Handbook 🙂

Bonus Tip: Delete text even when you are in Insert mode

We all know that you can use the d and x keys to delete text when you are in Normal mode. But what if you want similar deletion powers in Insert mode?

In that case, below are a few key bindings that you can use:

  • Ctrl + w: Delete previous word (equivalent to db in Normal mode)
  • Ctrl + h: Delete previous character
  • Ctrl + u: Delete all the previous characters until the start of line (equivalent to d0 in Normal mode)
  • Ctrl + k: Delete all the leading characters until the end of line (equivalent to d$ in Normal mode)

Conclusion

I shared some of my favorite Vim tips. They involve things like indenting the whole file, getting a file diff before saving, opening a file with the cursor at a particular line, and more.

Some are minor productivity pushes, while others, like saving as the root user, can be critical.

Comment and let me know which tip(s) you will be using. Is there any other cool Vim trick you are proud of? Share it with the rest of us.

Enabling Open Source Projects with Impactful Engineering Experience

This post originally appeared on the FINOS Community Blog. The author, James McLeod, is the Director of Community at the Fintech Open Source Foundation, a project of the Linux Foundation. You may also want to listen to the Open Source in Finance podcast

I often talk about “engineering experience” and the importance for open source projects of providing fast, easy, and impactful ways for open source consumers to realise a return on engagement. Just as e-commerce stores invest in user experience to encourage repeat sales, successful open source projects provide a slick installation, well-written contextual documentation, and a very compelling engagement model that encourages collaboration.

In fact, within the open source community, it’s possible to drive commitment to open source projects through “engineering experience”. Successful projects develop lives of their own and build communities of thousands that flock to repos, Meetups and in-person events.

This article is focused on the “engineering experience” related to automation and deployment, but future articles will also cover providing an engaging README.md, contextual documentation and the workflows needed to engage new and experienced open source contributors.

ENGINEERING EXPERIENCE PROVIDES DAY ZERO OPEN SOURCE VALUE

The risk of ignoring an open source project’s “engineering experience” is the project becoming a lifeless repository waiting for a community to discover it. Imagine the questions that have been answered in dormant repos that could be solving real-world problems if engagement were easy.

At FINOS, we’re driven to provide day zero value to financial services engineers looking to utilise FINOS open source projects. This philosophy is demonstrated by FINOS projects like Legend, Waltz, Perspective, and FDC3, which embrace open source methodologies for ease of installation.

Without a healthy “engineering experience”, engineering teams might find themselves working through reams of documentation, setting flags and system settings that could take days to configure and test against each and every operating system on their route to production.

The scenario highlighted above has been mitigated in the FINOS projects Legend and Waltz by using Juju and Charms, an open source framework that enables easy installation and automated operations across hybrid cloud environments. Without Juju and Charms, Legend and Waltz would need to be manually installed and configured for every single project instance.

By adopting Juju and Charms, Legend and Waltz are shipped in a way that enables the projects to be installed across the software development lifecycle. This accelerator provides a positive “engineering experience” whilst increasing engineering velocity and saving development and infrastructure costs.

From the very first point of contact, open source projects should be smooth and simple to understand, install, deploy and leverage. The first set of people an open source project will meet on its journey to success is the humble developer looking for tools to accelerate projects.

Hybrid cloud and containerisation are one powerful example of how projects should be presented to engineers to vastly improve the end-to-end engineering experience; another is the entire node.js and JavaScript ecosystem.

ENGINEERING EXPERIENCE ENABLES NODE.JS AND JAVASCRIPT OPEN SOURCE DEVELOPMENT

Take node.js and the various ways the node ecosystem can be maintained. I’m a massive fan of Node Version Manager, an open source project that enables the node community to install and traverse versions of node from a simple, easy-to-use command line tool.

Node Version Manager removes the requirement to install, uninstall, and reinstall different versions of node on your computer from downloaded binaries. It runs on your local computer and manages the needed version of node with simple bash commands.

After installing nvm with a simple curl of the latest install.sh, Node Version Manager is running on your local computer (a Mac, in my case), and node can be installed with nvm install node. It is such a simple way to keep the node.js community engaged, updated, and supported. Not only that, but the vast open source world of JavaScript can now be leveraged.

With Node Version Manager provided as an open source tool, the further “engineering experience” of yarn and npm can be explored. This enables FINOS projects like Perspective and FDC3 to be installed using node.js, accelerating the financial services industry with simple commands like yarn add @finos/perspective and yarn add @finos/fdc3.
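Putting the commands mentioned above together, a minimal end-to-end setup looks like this (assuming nvm and yarn are already installed):

# Install the latest node.js via Node Version Manager.
nvm install node
# Pull the FINOS packages into a JavaScript project.
yarn add @finos/perspective
yarn add @finos/fdc3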

This chaining together of “engineering experience”, which removes the pain of manual configuration by leveraging containers and command line automation, not only invites experimentation but has also contributed greatly to the exponential success of open source itself.

As these articles move through the different ways to engage open source communities and make open source projects successful, it would be great to hear about your “engineering experience” experiences by emailing james.mcleod@finos.org or by raising a GitHub issue on the FINOS Community Repo.
