Good To Go Live?

This post will not make you a hacker overnight, nor will it teach you how to hack a website; it is not a shortcut to becoming a security tester. This article is purely about helping individuals build secure applications.

If you are from infosec, you might be tired of hearing people ask, "Is the application good to go live? Can I push my code to production?" Below are some easy steps developers can follow to build a secure application; they also reduce the effort and time the security team spends on vulnerability assessment and penetration testing. It is always advisable to ship code with proper validation and verification.

1. Authentication:

a) Don’t hardcode credentials: Credentials should not be embedded in the application. Hardcoding may seem convenient during the development stage, but credentials must never be shipped inside the application code.

b) Store Database Credentials Securely: Database credentials must be stored in a centralized location, separate from the application, and access to that store must itself require authentication.

c) Error handling on invalid credentials: Error messages for invalid credentials must be generic and must not reveal whether the username, password, or any other field was wrong, so that they cannot be used for username enumeration.

d) Account lockout against brute-force attacks: An account lockout policy needs to be implemented to safeguard against brute-force attacks on both the login and forgot-password features. The account should be locked for a certain period of time, say 24 or 48 hours, or until it is manually unlocked by application support.
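
A minimal sketch of such a lockout counter (Python; the 5-attempt limit, 24-hour window, and in-memory store are assumptions for illustration, not a prescribed design):

```python
import time

MAX_ATTEMPTS = 5                 # assumed policy: lock after 5 failed attempts
LOCKOUT_SECONDS = 24 * 3600      # assumed lockout window of 24 hours

failed_attempts = {}             # username -> (count, locked_until); a real app would persist this

def check_login(username: str, credentials_ok: bool) -> str:
    count, locked_until = failed_attempts.get(username, (0, 0.0))
    if time.time() < locked_until:
        return "locked"                      # still inside the lockout window
    if credentials_ok:
        failed_attempts.pop(username, None)  # reset the counter on success
        return "ok"
    count += 1
    if count >= MAX_ATTEMPTS:
        failed_attempts[username] = (0, time.time() + LOCKOUT_SECONDS)
        return "locked"
    failed_attempts[username] = (count, 0.0)
    return "invalid"
```

The same counter can protect the forgot-password flow as well.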

e) Browser memory: Credentials should not be retained in browser memory or caches after logout. Note that Base64 is an encoding, not encryption; credentials must never be stored on the client side, and pages that handle them should be served with cache-control headers that prevent caching.

2. Authorization

a) Access control checks: Route every authorization decision through a single, centralized access control component (the principle of complete mediation) instead of scattering ad-hoc checks for different privilege levels across the application.

b) Least Privileges: Users of an application should not be given root or administrative access; instead, grant only the minimal privileges required. If access is not explicitly allowed, it should be denied by default.

c) Don’t use direct object references: Do not expose direct references to files or database keys that can be manipulated to gain access to other users' resources. Access to each resource should be authorized against the identity of the requesting user (see the sketch below).
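
A rough illustration of an ownership check before serving a resource (Python; the invoice store and field names are hypothetical):

```python
# Hypothetical in-memory store; a real application would query its database.
INVOICES = {
    101: {"owner": "alice", "amount": 250},
    102: {"owner": "bob", "amount": 980},
}

def get_invoice(invoice_id: int, current_user: str) -> dict:
    invoice = INVOICES.get(invoice_id)
    if invoice is None or invoice["owner"] != current_user:
        # Deny by default: the requester must own the resource, not merely know its ID.
        raise PermissionError("access denied")
    return invoice

print(get_invoice(101, "alice"))   # allowed
# get_invoice(102, "alice")        # raises PermissionError: alice does not own invoice 102
```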

d) Validated redirects: If redirects and forwards are not validated properly, an attacker can use them to reach private content without authentication or to send users to malicious sites.

3. Input Validation and Output Encoding (Text Fields)

a) Output Encoding: All output operations must contextually encode data before displaying it in the browser (see the sketch after the next item).

b) Set Encoding in the Application: For every page in your application, set the encoding using HTTP headers or meta tags within the HTML. This ensures that the encoding of the page is always defined and that the browser does not have to determine the encoding on its own.
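
A minimal sketch of points (a) and (b) in Python, using the standard html.escape helper; the header dictionary and meta tag are illustrative stand-ins for however your framework actually sets them:

```python
import html

def render_comment(user_comment: str) -> str:
    # Contextually encode user-supplied data before it reaches the HTML page.
    return "<p>" + html.escape(user_comment) + "</p>"

# Declare the page encoding explicitly instead of letting the browser guess.
RESPONSE_HEADERS = {"Content-Type": "text/html; charset=UTF-8"}
HTML_META_TAG = '<meta charset="UTF-8">'

print(render_comment('<script>alert("xss")</script>'))
# -> <p>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```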

c) Prefer White-lists Over Blacklists: All input fields should be validated. Whitelisting is the preferred approach: only data that meets the defined criteria is accepted, and everything else is rejected.

d) Tokens against Forged Requests: Applications must embed a random token that is not known to third parties into the HTML form to prevent CSRF (Cross-Site Request Forgery) attacks. This CSRF protection token must be unique to each request. A forged request cannot be submitted because the attacker does not know the value of this random token.
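
One possible shape of such a token, sketched in Python with the standard secrets and hmac modules; the session dictionary stands in for whatever session store the application uses:

```python
import hmac
import secrets

def issue_csrf_token(session: dict) -> str:
    # Unpredictable token tied to the user's session; embed it in a hidden form field.
    token = secrets.token_urlsafe(32)
    session["csrf_token"] = token
    return token

def verify_csrf_token(session: dict, submitted: str) -> bool:
    expected = session.get("csrf_token", "")
    # Constant-time comparison; a forged request fails because the attacker cannot know the token.
    return bool(expected) and hmac.compare_digest(expected, submitted)

session = {}
form_field = issue_csrf_token(session)
print(verify_csrf_token(session, form_field))   # True
print(verify_csrf_token(session, "guessed"))    # False
```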

e) File Uploads: When accepting files from the user, validate the file size, type, name, and contents, as well as where the file will be saved, and ensure that the destination path cannot be overridden.
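
A hedged sketch of those checks in Python; the allowed extensions, size limit, and upload directory are assumed values, not recommendations:

```python
import os

ALLOWED_EXTENSIONS = {".png", ".jpg", ".pdf"}   # assumed policy
MAX_SIZE_BYTES = 5 * 1024 * 1024                # assumed 5 MB limit
UPLOAD_DIR = "/var/app/uploads"                 # assumed destination

def save_upload(filename: str, data: bytes) -> str:
    if len(data) > MAX_SIZE_BYTES:
        raise ValueError("file too large")
    # Strip directory components so '../../etc/passwd' cannot escape UPLOAD_DIR.
    safe_name = os.path.basename(filename)
    ext = os.path.splitext(safe_name)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError("file type not allowed")
    destination = os.path.join(UPLOAD_DIR, safe_name)
    with open(destination, "wb") as fh:
        fh.write(data)
    return destination
```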

f) Parameterized SQL Queries: SQL queries should not be created dynamically using string concatenation. Similarly, the SQL query string used in a bound or parameterized query should never be dynamically built from user input.
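
For illustration, a parameterized query using Python's built-in sqlite3 module (the table and data are made up); the same idea applies to any driver that supports bound parameters:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

username = "alice' OR '1'='1"   # attacker-controlled input

# Unsafe (do not do this): string concatenation lets input change the query's meaning.
# query = "SELECT email FROM users WHERE username = '" + username + "'"

# Safe: the driver binds the value, so it is treated as data, never as SQL.
rows = conn.execute(
    "SELECT email FROM users WHERE username = ?", (username,)
).fetchall()
print(rows)   # [] - the malicious string matches no user
```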

g) Terminate/abort invalid inputs: Rejecting requests that contain unacceptable characters is a sound last line of defence, but if implemented poorly it can enable a denial-of-service attack: an attacker can flood the system with unexpected inputs, forcing it to spend scarce processing and communication resources on rejecting them.

4. Session management

a) Session fixation: Session tokens must be regenerated at login, so that the pre-login and post-login sessions use different tokens.

b) Session variables invalidation: Session variables should be invalidated on the server after logout, or after a session timeout following a certain period of idle time.

c) Unique session variables:  Session variables must be unique and should not be reused for different accounts.

d) Strong session variables: Session identifiers should be random and of sufficient length that they cannot easily be guessed by an attacker (see the sketch after the next item).

e) Secure cookie variables: Use secure cookie attributes, i.e., the HttpOnly and Secure flags. Cookies should be scoped to the exact domain and path, and they should not be shared with other subdomains.
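
A small sketch using Python's http.cookies module; the cookie name and domain are placeholders:

```python
from http import cookies
import secrets

# A long, random identifier also satisfies points (c) and (d) above.
session_id = secrets.token_urlsafe(32)

jar = cookies.SimpleCookie()
jar["session"] = session_id
jar["session"]["httponly"] = True              # not readable from JavaScript
jar["session"]["secure"] = True                # sent only over HTTPS
jar["session"]["domain"] = "app.example.com"   # exact host, not shared with other subdomains
jar["session"]["path"] = "/"
jar["session"]["samesite"] = "Strict"          # Python 3.8+; limits cross-site sending

print(jar["session"].OutputString())
```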

5. Communication Protocols

a) SSL: Clear-text protocols such as HTTP are always prone to MITM (man-in-the-middle) attacks, since an attacker can intercept requests at the network level; hence SSL/TLS is recommended for authentication and all post-login pages.

b) Disable HTTP: Resources served over a secured channel such as HTTPS must not also be accessible over clear-text protocols such as HTTP.

c) Salted passwords: Passwords must be stored using a secure algorithm; iteratively hashing with a random salt added to each password makes the stored hashes much harder to crack (see the sketch below).
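
As one possible approach, a PBKDF2 sketch with Python's hashlib; the iteration count is an assumed work factor, not a mandated value:

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000   # assumed work factor; tune it to your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    # A random per-user salt plus many iterations makes offline cracking far slower.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)
```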

d) SSL Certificates: SSL certificates must be issued by a reputable CA and configured with secure key-exchange algorithms and ciphers.

6. Logs

a) Maintain logs of all privileged activity: this includes all authentication activities, all privilege changes, administrative actions, and access to sensitive data.

b) Secure your logs: Logs should be stored carefully on a secure server, protected against tampering and loss, and retained for the duration required by industry policies.

c) Improper logs: Maintaining appropriate logs and storing them safely are the key parts of log management. Sensitive information should never be written to logs, and in some situations the entire log needs to be encrypted.

7. Reset password

a) Once a reset-password link is used, it should expire and not be usable again.

b) Until the user actually resets the password, the previous password should remain valid; it should not be disabled merely because a reset was requested.

c) Even if unused, the reset-password link should expire within a defined time, say 48 or 72 hours.

d) The reset-password link should be served over SSL/TLS.

e) Any previously issued reset link should expire as soon as a new one is generated.

f) The token in the reset link should be mapped to the user's email ID and must not be usable to reset another user's password.

g) The token should not be sequential, easily guessable, or short; it should be a minimum of 16 characters so that it cannot easily be brute-forced (a token-handling sketch follows this list).

h) The password policy should also be enforced on the reset-password page.
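
Pulling several of these points together, a rough Python sketch of single-use, expiring reset tokens (the in-memory store and 48-hour lifetime are assumptions for illustration):

```python
import secrets
import time

RESET_TOKENS = {}               # token -> (email, expiry); a real app would use a database
TOKEN_TTL_SECONDS = 48 * 3600   # expire unused links after 48 hours

def issue_reset_token(email: str) -> str:
    # Invalidate any earlier link for this user before issuing a new one.
    for tok, (mail, _) in list(RESET_TOKENS.items()):
        if mail == email:
            del RESET_TOKENS[tok]
    token = secrets.token_urlsafe(32)   # long, unguessable, non-sequential
    RESET_TOKENS[token] = (email, time.time() + TOKEN_TTL_SECONDS)
    return token

def consume_reset_token(token: str, email: str) -> bool:
    entry = RESET_TOKENS.pop(token, None)   # single use: the token is removed when consumed
    if entry is None:
        return False
    mail, expiry = entry
    # The token must belong to this user's email and must not have expired.
    return mail == email and time.time() < expiry
```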

8. Error Handling

a) Generic error messages: Display generic error messages to the user; error messages should not reveal details about the application such as the technology used, internal IPs or paths, or stack traces.

b) Framework-related errors: All framework-generated errors must be generic, or the messages should be customized so that sensitive information about the framework is not revealed.

c) Unhandled exceptions: Exceptions should be handled explicitly on both the client and the server side. It is good practice to have a catch-all handler (and a 'finally' block for cleanup) that logs the details and returns a generic error message.
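
A minimal sketch of such a catch-all handler in Python, logging details server-side while returning only a generic message (the response shape is hypothetical):

```python
import logging

logger = logging.getLogger("app")

def handle_request(process):
    try:
        return process()
    except Exception:
        # Log the full details server-side for the operations/security team...
        logger.exception("unhandled error while processing request")
        # ...but return only a generic message to the user.
        return {"status": 500, "message": "Something went wrong. Please try again later."}
    finally:
        # Release resources or close connections regardless of success or failure.
        pass
```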

Experience of Security Testing on APIs

This blog post is all about my first experience with security testing on APIs. Back in 2013, I was working on a project "XYZ" at Moolya Software Testing. It all started on April 1st, 2013, when Pari, the "Master Shifu", sent me a mail with the details of the project: an advanced voicemail service, especially for business people who do not want to miss calls at busy times. Ok, now what form was the product in? Was it a web app or a mobile app? No, these were APIs, and we didn't have any idea about APIs. The first task was to understand the API and its functionality. I learnt about it from team members and by exploring on the internet, and understood what an API is, why it exists for an app, and how it is used in an app.

An API (Application Programming Interface) allows your products and services to communicate with each other. In this way, an API lets you open up data and functionality to other developers and other businesses.

APIs act as the backbone of any application; in simple words, they are the purely technical layer that communicates between the presentation layer and the database. APIs reside in the business layer, and we can test them for their core functionality; testing the application at this level also helps reduce its maintenance. This triggered me to test the APIs for security, since security defects here would have a great impact on the product. And that is where I decided to do security testing on APIs. Now coming back to the project: what were we supposed to test, what were the requirements, why were we testing, and what was the scope of testing? These were some of the questions that arose.

After getting clarity on all these queries, we started testing. The next issue was to understand Windows Azure: these were not normal APIs but were built on a different technology (Windows Azure). Initially we explored this technology and then started testing. Then there was a dilemma in selecting the tools for testing APIs; there are numerous tools, and selecting the one that best suited the project was difficult. Finally I landed on SoapUI, which supports both manual and automated testing. I continued with API automation, which was my primary task, while also testing the functionality manually. Then I thought, why don't I start security testing on the APIs? Now the question was how, and where? So began the exploration of testing APIs for security; in simple words, how can a hacker misuse an API, and what are the effective attacks on an API? Santhosh Tuppad, a well-known security tester, suggested a book called "Hacking Web Services" by Shreeraj Shah, which helped me understand what kinds of attacks can be carried out on APIs.

Coming to the attacks mentioned in the book, some of them were simply great, but most were outdated: today's servers are built in such a way that they can handle these kinds of attacks at the initial stages. Coming back to the project, the first task was to note the different types of attacks and vulnerabilities.

While I was reading Shreeraj Shah's book on hacking web services, I found that the approaches were traditional; however, it was helpful in understanding several concepts. The challenge was to learn new techniques and new ways of hacking web services. There started my exploration! I was doing black-box testing, and it all depends on how much quality we want to bring in as black-box testers. While performing the web services testing, I didn't actually look into the code of the application (like white-box testers do); I tried to cover all the possible scenarios using the tool (SoapUI). Groovy scripting came into the picture when I started automating the regular checks on the APIs. Though it is not necessary to learn Groovy to perform this testing, it surely gives you an extra advantage over other testers 😉

Tests that I performed to find vulnerabilities:

  • Insecure Direct object Reference

  • Information Leakage and improper error handling

  • Authentication and Certification/Permission and Access Issues

  • Authorization

  • SQL/LDAP/XPATH/OS Command Injection

  • Virus/Spyware/Malware Injection

  • Session/Parameter Tampering

  • Denial of Service/Large Payload

  • Brute-force

  • Data type mismatch/Content Spoofing

  • Information Leakage/Error leakage

  • Replay Attacks

  • Buffer Overflow

  • XML Parsing Attacks

  • Spoiling Schema

  • Complex and Recursive structure as payload

  • Fault Code Leaks/Poor Policies

Learning the basics of API security was fun and easy. To master it, anyone will need a good grip on the basic vulnerabilities. As a starting point for the above-mentioned attacks, I searched forums and blogs for detailed information.

When malicious requests are executed, a number of technical layers can be targeted, including:

  • NIC (Network interface Card) and its drivers.

  • Operating System, as it processes the incoming request from the NIC.

  • Target application server that handles the request (for example Apache, IIS, etc.)

Let me explain in a little more detail how I managed to accomplish the task of security testing on APIs.

When I come to think of it: how would I identify security vulnerabilities in the first place? What response would I get back when trying, for example, a SQL injection attack, that would allow me to identify that there is a security vulnerability? Actually, this wasn't that easy.

More often than not I would get an error message back telling me that the service was unable to process the malicious request. That might be a good thing during development, but this was one of the situations where I needed to don the hat of a hacker: does the error message tell me something I shouldn't know about the system? For example, which database they are using? Which version? Which language or framework? Maybe it exposes an internal IP address? Any of these might be exactly what a hacker is looking for; the information allows them to target specific attacks on your application (there are publicly available databases of known security issues in most software, like Exploit DB), which in turn might give them the backdoor they were looking for. This is called "sensitive information exposure": searching the messages returned by the application for anything that might expose system details to the hacker, such as version numbers, software technologies, and so on.

The other scenario was tests resulting in a denial of service. These are unusual but valuable in the sense that I found them first and could make sure they won't happen again, so that end users of the application are not affected. Interpreting the actual responses to a security test from a security point of view was one of the major issues I faced; it requires corresponding system knowledge and understanding to be able to report them, and learning to pinpoint exploits in responses is what helped me the most.

Finally, I conclude that:

  • Neglecting possible security vulnerabilities and related issues in your services and APIs can put your data and your business at serious risk.

  • Taking security seriously and building basic expertise around it is an investment well worth making.

  • The core mechanics of most security vulnerabilities are easy to understand, test for, and prevent.

Experience at Mozilla Summit 2013

It was Tuesday, May 21st 2013, around 4:30 AM IST. I was in bed struggling hard to sleep. I finally gave up and switched on my system to check my mail or just go online. A mail from Mozilla got rid of any lingering sleep; I was wide awake, yet strangely disoriented. "[2013 Mozilla Summit!] It's official, you're invited!!" Am I dreaming? Later, when I read everything, I was assured that it was real and not spam 😉

I was pretty excited about the invite and was very happy to be a part of the summit.

Mozilla Summit 2013

The Mozilla Summit 2013 was a 3-day event where Mozillians meet, celebrate the success of Mozilla, and make plans for future advancements in the web and technology.

This was the first Mozilla-wide Summit in three years and we couldn’t be more excited! This year, the Summit was held in three different cities:

  1. Brussels, BE
  2. Santa Clara, CA, USA
  3. Toronto, ON, Canada

The Summit was held simultaneously at all three locations on October 3rd, 4th, and 5th.

My time there

I was invited to attend the Summit at Santa Clara, California. I was so excited to attend the Summit and meet Mozillians from all over the world.

Finally, on 3rd October, I was packed and ready to fly and meet the family of Mozilla, the Mozillians. It was my first flight journey: very exciting, a bit scary as well ;), and an enjoyable experience. When I landed at San Francisco, there was a huge team welcoming us with big smiles. We introduced ourselves and were taken to the Santa Clara Marriott hotel (the hub for most of the events in Santa Clara). We were shown to our rooms, where we rested and got refreshed.

Finally, the much-awaited Mozilla Summit started with a warm welcome by Tristan, followed by a talk by Mitchell Baker on "What makes Mozilla Mozilla!" and a series of other talks on the strengths of Mozilla and how it is going to help the world, which in turn builds healthy Mozillians.

Mozilla describes the internet as OPEN, with the following mantra:

Know more, Do more, Do better!!

Know more about the world and emerging technology through the internet. Do more, without any kind of restrictions. Do better, innovating to help the world.

The World Fair was an event where Mozillians from different countries demonstrated their work: the ideas and products created by communities all over the world.

The next day started with "Mozillian Pilates" by Pascal Finette, which was refreshing after the long journey from Bangalore, India to Santa Clara, California. Later there was a talk by Tristan, followed by similar talks. The Innovation Fair, the next event, was a platform where Mozillians could display the inventions they were working on. The focus of every Mozillian was the same: everybody wanted to change the world with technology.

This was followed by different open sessions, and the day concluded with a photo session of all Mozillians, one more memorable moment of the Summit.

Pillars of Mozilla:

  1. Build Products: Mozilla aims to build the web through great products, services, and programs. As we do, we make sure to delight and empower people across the internet and to show what is special about the web.
  2. Empower Communities: communities that lead, build, and promote Mozilla’s work, and that inspire and enable the Mozilla spirit in others.
  3. Teach and Learn: help people understand what “openness” feels like and how the web works, and give them the knowledge they need to create, shape, and control their own online lives.
  4. Shape Environments: we want to shape environments to embody and spread Mozilla’s vision and values. In particular, we aim to shape the consumer product environment towards openness, to shape the overall web platform so that it continues to embody the values in the Mozilla Manifesto, and to shape people’s mindset to understand and appreciate these values.

Overall, the Summit was extremely awesome: I was able to meet so many Mozillians, and the energy levels were very high. I especially liked the innovation part, which was simply awesome; I was stunned by all the different innovations. If you ask me what I learnt, I would just say "the Summit turned me from an introvert into an extrovert." Mozillians taught me how to interact, pitch into conversations with complete strangers, and think more broadly.

The Summit was an opportunity to celebrate our success and build the future with fellow Mozillians; in simple words, to understand who we are and what we have accomplished. It was an amazing chance for me to connect, hack, and build healthy relationships with people and the community.

Five things I learned at the Summit were:

  1. Turning Ideas into action
  2. Blending passion with voice and actions
  3. Choosing our own adventure
  4. Importance of Community and Network
  5. Culture

I was in the right place. It's so important to have a good, trusting relationship with the community. Different people have different approaches, and what works for me may not work for others. But the Summit had a culture with a single shared vision: to improve the way the internet is! The mishmash of research, expertise, responsiveness, approachability, passion, and kindness may be one of the reasons why I loved the Summit this much.

Some of the Mozillians whom I met at Summit 2013:

@bkerensa, @ajoysingha, @rahid, @bacharakis, @playingwithsid, @iamjayakumars, @amodnn, @vineelreddy, @rosanardila, @pfinette, @SujithReddyM, @MadalinaAna, @Sayak_Sarker, @NareshSundar, @midhunsss, @GalaxyK, @rtnpro,  @komal_Ji_gandhi, @kaustavdm, @H34V3NG0D, @neocatalyst, @haseeb_offline, @AjayJogawath, @chowdhury_sayan, @GauthRaj, @MerajImran

Reference: https://wiki.mozilla.org/Summit2013

Security Testing for Beginners

Recently, I authored an article for the TestingCircus e-magazine, and I would like to publish it on my blog as well, so that I can reach readers who are not subscribed to TestingCircus or couldn't read it for whatever reason.

Thanks to Mr. Santhosh Tuppad for encouraging me to write this article and to Mr. Ajoy Singha for providing me the opportunity to write for TestingCircus; I look forward to continuing my contributions to the TestingCircus e-magazine. You can find my article at the following link: http://www.testingcircus.com/testing-circus-2013-september-edition/

Without much ado, here I present my article.


There is no wrong way to start hacking; every way is the right way, and I have my own. Whatever your style of hacking is, make sure it's consistent. If you are starting out today, you can benefit from whatever skill sets you already have. Don't learn to hack; hack to learn.

Well, coming to the point of how I started hacking and how I landed up here: it was in the year 2008. I was in the 2nd year of my diploma when one of my friends was trying to download videos by searching on Google. In 2008, getting a video onto your local machine was one of the biggest achievements for people of my age. My friend showed me how to get videos from Google by extracting only videos from the vast search results. He asked me to enter a string along with the search query.

filetype:avi <search query>

He didn't know what it was, and said he had come to know about it through a senior. Ok!! As I was very much interested in computer technologies, I tried to find out what these strings were. I referred to many articles and found that they are called GOOGLE DORKS. I even came across terminology like white, black, and grey hat hackers. During this phase, I got a common response from whoever I asked about hacking: "Hacking is very difficult and I don't know anything about it except that it is illegal."

But hacking is not necessarily illegal. There are 3 categories of hackers:

  • Black Hat Hackers
  • White Hat Hackers
  • Grey Hat Hackers

Black hat hackers are those who hack covertly for malicious reasons, with intent to harm others; such people can also be referred to as 'crackers'.

White Hat hackers are those who perform hacking for legitimate reasons and use their skills and knowledge for good, e.g. IT Security technicians testing their systems and researchers testing the limitations of any software.

A grey hat hacker is a combination of a black hat and a white hat hacker: a grey hat hacker may surf the internet and hack into a computer system for the sole purpose of notifying the administrator that their system has a security defect.

According to surveys, the most common technique for hacking a website is SQL injection. SQL injection is a technique in which a hacker inserts SQL code into a web form to obtain sensitive information (such as usernames and passwords), gain access to the site, and deface it. Traditional, manual SQL injection is quite difficult, but nowadays there are many tools available online through which any script kiddie can use SQL injection to deface a website; because of these tools, websites face far more of these attacks. Some of the tools used for SQL injection are mentioned in this article. As far as I know, nothing is bug-free, and there will be exploits every minute and hour.
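
As a rough illustration of the idea (Python; the query and table are made up), a classic payload such as ' OR '1'='1 changes the meaning of a query built by string concatenation:

```python
# Hypothetical login check built by string concatenation (do NOT do this in real code).
username = "admin"
password = "' OR '1'='1"   # classic injection payload typed into the password field

query = ("SELECT * FROM users WHERE username = '" + username +
         "' AND password = '" + password + "'")
print(query)
# SELECT * FROM users WHERE username = 'admin' AND password = '' OR '1'='1'
# The trailing OR '1'='1' is always true, so the check passes without a valid password.
```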

Some of the tools which help in finding the vulnerabilities are discussed below:

1.      Wireshark, formerly known as Ethereal, is one of the most powerful network security tools: a network packet analyzer that works on any network. It is used to capture each packet sent to or from your system and the router. If you're capturing on a wireless interface and have promiscuous mode enabled in your capture options (which typically requires administrator/super-user rights), you'll also see packets sent between other nodes on the network. It also includes filters (e.g., DNS, TCP, UDP, ip.addr), color-coding, packet capture, and other features that let you dig deep into network traffic. Wireshark is an extremely powerful tool, and this is just scratching the surface of what you can do with it. Professionals use it to debug network protocol implementations, examine security problems, and inspect network protocol internals. Getting to that level takes a fair amount of practice: practice to know how and where to capture the right data, which filters to use, and how to interpret the data.

People willing to learn can use this link to get sample Wireshark captures for hands-on experience: http://wiki.wireshark.org/SampleCaptures

2.      Fiddler is a free web debugging tool that captures all the traffic between your computer and the internet. It acts as a proxy between the browser (or any application on the local machine) and the internet: all traffic flows through Fiddler, where requests can be altered before the altered request is sent on to the server. In simple words, Fiddler sits between the HTTP client (the browser) and the HTTP server.

Normally it configures itself for the browsers on a particular machine, but you may have to manually configure a browser to route all of its traffic through Fiddler.

Fiddler can also be used to view statistics, inspect requests and responses, act as an auto-responder, and send requests on its own without any browser. Its Composer functionality can be used to exercise APIs, you can write scripts that are helpful for check automation, and it is capable of decrypting HTTPS traffic.

3.      Nessus was first publicly released in 1998. It started as an open source vulnerability scanner but has since become a paid tool. Nessus is used for scanning both web applications and networks, where the network can be an internal or external IP range. It is designed to automate the testing and discovery of known security problems, allowing system administrators to correct problems before they are exploited.

Nessus uses a client-server design: the user sets up a server, and multiple Nessus clients can attach to it and initiate vulnerability scans. Servers can be placed at various strategic points on a network, allowing tests to be conducted from various points of view.

Nessus checks for vulnerabilities against a plugin database that is updated daily; the latest plugins can be retrieved with the "nessus-update-plugins" command.

4.      IBM Rational AppScan is an automated web vulnerability scanner that helps find vulnerabilities quickly and effectively; even a semi-technical person can use the tool and find vulnerabilities. Using IBM AppScan, we can decrease the risk of web application attacks and data breaches. It helps test a web application either on the production site or on any staging site, checking it for web attacks.

Basically, once you add a web app to IBM AppScan to test its security, the initial step is to crawl all the pages/links of the application that are allowed to be crawled based on robots.txt.

The basic functionalities of IBM AppScan are:

  1. Gives broad coverage in its test report.
  2. It mainly concentrates on the OWASP (Open Web Application Security Project) Top 10 web application vulnerabilities.
  3. Uses accurate, advanced scanning algorithms, hence fewer false positives.
  4. Recommendations, which I personally like: it gives a description of each vulnerability found and the risk involved in not fixing it.

As we all know, automated scanning is not perfect all the time, and it is not advisable to depend completely on an automated scanner; hence AppScan also provides manual testing of any vulnerability found, to confirm findings and weed out false positives.

IBM AppScan is a paid tool, and it has a trial version as well if you are interested in exploring the application.

5.      Nmap, short for "network mapper", is an open source application that quickly scans ranges of devices, such as desktops, laptops, or mobile devices, and provides valuable information about the devices connected to a particular network. Nmap is available for all major platforms and can be operated in two ways, command-line mode and GUI mode; most people prefer the command line for its advanced features, though it requires more technical knowledge.

Nmap uses raw IP packets to determine which hosts are available on the network (host discovery), which services are enabled, and the operating system and its version, using techniques such as TCP SYN or TCP connect probes to find active hosts. Nmap is used both by security researchers and by hackers who want to find weaknesses and exploit them.

Nmap can perform different types of scans: some are more aggressive, and some are simple and designed to be stealthy and go undetected. Depending on the type of scan performed, different information can be discovered; some of the scans are Ping, SYN Stealth, UDP Scan, IP Protocol Scan, ACK Scan, RPC Scan, and List Scan.

6.      Havij is an automated SQL injection tool that helps hackers and security researchers find and exploit SQL injection vulnerabilities in a vulnerable web application. Using Havij, a user can access the database, retrieve DBMS users and passwords, dump tables and columns, fetch data from the database, run SQL statements, and even execute commands on the operating system.

Hackers use Havij along with vulnerability scanners such as IBM AppScan or WebInspect: vulnerability scanners find vulnerabilities but do not help with actual exploitation, and that is where Havij showcases its functionality. In other words, vulnerability scanners help you find a list of vulnerable web pages, whereas Havij gives you access to the database for full exploitation.

Once a URL is fed to Havij, it comes up with a list of the databases in use, the version, and the database names. After selecting a particular database, we can drill down to tables, then to columns, and even to the actual data. Passwords are usually hashed; a set of password crackers bundled with the tool helps the user recover the hashed passwords, and it also includes a feature that helps users find the admin page of a particular web application. In simple words, it is more useful for hackers than for security researchers.

7.      SQLMap is one of the most popular and powerful open source SQL injection automation tools; it is built in Python and can run on any platform that has Python installed.

Given a vulnerable URL, SQLMap can exploit the database and extract sensitive information such as database names, tables, columns, and all the data in the tables. It can even read and write files on the remote file system under certain conditions.

The application runs only from the command line and has no graphical interface, with simple options to extract information from the database.

Now I am on my own domain (solidmonster.com)! Wohoooo!

Dear readers, you have been following me on this blog on *.wordpress.com. I am freaking happy to announce that I have transferred all the blog posts to a new domain with my own hosting. Please visit http://solidmonster.com/ from now on. Thanks all!

The experience starts

Moolya is one of the craziest companies I have ever seen!!! Yes, that's true. The journey starts here: I came to know that Moolya was hiring fresh minds, and I wanted to apply. But how would my resume be different? I knew Moolya hires people who think differently and do things differently [THINK OUT OF THE BOX], and of course I'm one among them, so I wanted to send a resume written by my own hand, not typed, and post it to Moolya. But I didn't have enough time to do this. Now came the question of how to apply. I started searching and exploring more about Moolya, and the first thing I found was its blogs (the one-year celebration posts); I was really impressed by them. Then came the Curious Tester's blog, and I was amazed reading it. Finally I got the mail ID of the Curious Tester, Miss Parimala Hariprasad, and I sent a mail briefing her about myself and mentioning that I was seeking an opportunity. I was eagerly waiting for the response, and to my surprise I got a mail the next morning, the 26th, asking me to come for an interview the same day. I was so happy to get an interview call so soon, but I was a bit scared as I was not prepared for the interview. But with great positive energy, I said to myself, "This is your chance, utilize it. If not, you can't be a Moolyavan."

It was 11 am when I reached the Moolya office for the interview. My first round was fantastic, and I was surprised that I had enough knowledge and skill to impress and convince a Moolyavan (Vipin). Programming was my next round; I finished it and was waiting for the results when, in the meantime, I heard people fighting and complaining like school kids 😀 One person went on like, "Pariiiii, ask him to give back my note, it's mine." :P I was like, WHAT THE HECK!!! These people are crazy; they were having fun like kids. This was the first instance where I observed the craziest side of a Moolyavan.

And finally the results of the programming round were out, and I had cleared it. Anjali (HR) called me in for another round of interview. I was shocked as soon as I entered the cabin: two people (Pradeep Soundararajan and Sunil Kumar) welcomed me, and I was like, GOD, "2 on 1", that's ridiculous. But I also told myself, "You know what you are, and you have to prove it to these people." As soon as I settled in, they started with a couple of standard interview questions about my past experience, skills, etc. They were really happy with my responses. Actually, they were shocked by my responses; the final verdict came out, and I got the job. 🙂

There were a couple of things that made me go wow about Moolya:

  • Employees came to office in 3/4th pants.
  • Candidates appearing for an interview are given a greeting card to thank them for spending their precious time.
  • Moolya treats every employee as a Co-Owner of the company.
  • The foosball table.
  • The party after work.

I joined Moolya on 29th Jan. I was so excited that I reached office by 8 AM, and the office was not yet open 😀 I roamed around for a while and came back. My first task was to assemble my system in my cubicle, followed by a session on software testing by Master Shifu Parimala. It's the company's policy to take a newly joined employee out for lunch, so we went out, and the actual fun started by 4. Everyone was busy working on their projects when suddenly some people started calling out "Chocolate time!" and gathered around; I was shocked and waiting to see what it was. Then I noticed that I was in the middle of the gathering, and I understood it was for an introduction, followed by RAGGING :P Yeah, you heard it right: everyone asked me to dance to a song. The worst thing is I don't know how to dance; I had never tried in my life, but somehow I managed 😀 and this was followed by other stuff on testing.

I have just completed one month at Moolya. It has been a great time, with the pleasure of doing things differently in every way, including this blog: I never imagined myself writing a blog, and again it's through Moolya's inspiration. I'm really glad to work at Moolya for making me feel proud of myself.

It was exactly one month of working here as a tester when, on 26th Feb, a mail from Mozilla surprised me and made my day. And guess what: for the security bug I reported, I got a bounty of $500. Yeah, you heard it right, that's true. A security bug I reported in Mozilla got me into the Mozilla Hall of Fame, along with a bug bounty, both for the first time in my life. Again, the credit goes to Moolya, my team, and Ravi, who guided me on the right path.

Pari, special thanks to you, without you I wouldn’t have been here. All the credit goes to you. 🙂

Finally: "I am here for a mission, and I want to fulfill it with all the passion I carry and all the guidance I have. I hope a great journey awaits me with all the great people around me. For the time being, this is all I want to share. Stay tuned to know more about my software testing journey."