
Found 10401 Articles for Python

In this article we will learn about a machine learning program that can find word analogies from the provided words. Take the example "Apple : fruit :: car : vehicle". In this analogy, "apple" and "car" are the two things being compared, while "fruit" and "vehicle" are the categories they belong to. The analogy says that an apple is a type of fruit, just as a car is a type of vehicle. The human brain can identify this pattern easily, but training a machine to do the same task is difficult, as we will require very ... Read More
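As a rough illustration of how word analogies are commonly solved with word embeddings (the full article may take a different approach), here is a minimal sketch using gensim's pretrained GloVe vectors; the dataset name and example words are assumptions:

import gensim.downloader as api

# Load small pretrained GloVe vectors (downloaded on first use)
model = api.load("glove-wiki-gigaword-50")

# Solve "apple : fruit :: car : ?" by vector arithmetic:
# vec(fruit) - vec(apple) + vec(car) should land near "vehicle"
result = model.most_similar(positive=["fruit", "car"], negative=["apple"], topn=3)
print(result)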

In this article we will take a detailed look at histograms and their various types, and we will also see their implementation using Python. Histogram A histogram gives us a visual representation of data as a bar chart for numerical values, letting us see the different distributions and patterns in a dataset. The x-axis of a histogram denotes the range of values and the y-axis denotes the frequency or count of data points. Applications of Histogram 1. Analysis of Data Distribution We use a histogram to analyze the data distribution and ... Read More
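As a quick sketch of the idea (the sample data and bin count here are placeholders, not taken from the article), a histogram can be drawn with matplotlib:

import numpy as np
import matplotlib.pyplot as plt

# Sample data: 1000 values drawn from a normal distribution
data = np.random.normal(loc=0, scale=1, size=1000)

# x-axis: value ranges (bins), y-axis: count of data points in each bin
plt.hist(data, bins=20, edgecolor="black")
plt.xlabel("Value range")
plt.ylabel("Frequency")
plt.title("Histogram of sample data")
plt.show()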

In this article we will learn about ANNs and their types, and we will also see programs for different types of activation functions. Before diving into the types, let's get to know what an ANN is. ANN An artificial neural network (ANN) is a branch of machine learning that performs computation using a structure inspired by biological neural networks, in which each neuron can transmit a signal or processed data to the other neurons connected to it. This structure is similar to the human brain, where neurons are interconnected. A neural network is created when nodes, or neurons, are connected together. Artificial ... Read More
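Since the snippet mentions programs for different activation functions, here is a minimal NumPy sketch of three common ones; the full article may cover a different selection:

import numpy as np

def sigmoid(x):
    # Squashes input to the range (0, 1)
    return 1 / (1 + np.exp(-x))

def relu(x):
    # Passes positive values through, zeroes out negatives
    return np.maximum(0, x)

def tanh(x):
    # Squashes input to the range (-1, 1)
    return np.tanh(x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x), relu(x), tanh(x), sep="\n")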

In this article you will learn about machine learning, image classification and how to use Google's Teachable Machine for training models. Machine learning Machine learning is a subset of AI (artificial intelligence) used to develop models and algorithms with which we can make computers learn and make decisions without being programmed explicitly. It is an effective way to teach machines to learn from the given data and improve their performance over time. Computers can learn a task and make predictions or find patterns using data that describes the scenario about what we ... Read More
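Teachable Machine can export a trained image model in TensorFlow/Keras format. The sketch below assumes that export format and the usual file names (keras_model.h5, labels.txt); the image file name is hypothetical, so adjust everything to your own export:

import numpy as np
from tensorflow.keras.models import load_model
from PIL import Image

# File names assume the default TensorFlow/Keras export from Teachable Machine
model = load_model("keras_model.h5")
labels = [line.strip() for line in open("labels.txt")]

# Teachable Machine image models typically expect 224x224 RGB input scaled to [-1, 1]
img = Image.open("test_image.jpg").convert("RGB").resize((224, 224))
x = (np.asarray(img, dtype=np.float32) / 127.5) - 1.0
x = np.expand_dims(x, axis=0)

pred = model.predict(x)
print("Predicted class:", labels[int(np.argmax(pred))])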

In this article we will learn about the role of the Text-to-Text Transfer Transformer (T5) technique in data augmentation and how we can use it to improve NLP models. Natural Language Processing has recently seen rapid advances in the data augmentation field. Data augmentation is used to improve the performance of models based on natural language processing (NLP). Among the many techniques available for this, one is the Text-to-Text Transfer Transformer (T5). We can use this technique to perform multiple NLP tasks by using ... Read More
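One common way to use T5 for augmentation is to generate paraphrases of existing training sentences with the Hugging Face transformers library. This is only a sketch: the "t5-small" checkpoint is a generic placeholder, and in practice you would substitute a checkpoint fine-tuned for paraphrasing; the example sentence is also made up:

from transformers import T5Tokenizer, T5ForConditionalGeneration

# Placeholder checkpoint; swap in a paraphrase-fine-tuned T5 model for real augmentation
checkpoint = "t5-small"
tokenizer = T5Tokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

sentence = "Data augmentation improves the robustness of NLP models."
inputs = tokenizer("paraphrase: " + sentence, return_tensors="pt")

# Sample several alternative phrasings to use as extra training examples
outputs = model.generate(**inputs, do_sample=True, top_k=50,
                         num_return_sequences=3, max_length=40)
for out in outputs:
    print(tokenizer.decode(out, skip_special_tokens=True))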

In this article we will learn about various methods for finding the products of the columns of an unevenly sized matrix. Working with matrices is very common in fields like data analysis and machine learning, so there can be situations where we have to find matrix column products, which can be a challenging task. Let's see some examples of finding the column products of an unevenly sized matrix − Method 1: Using a Simple Loop In this method we use a simple nested loop to iterate through the matrix columns and compute their products. Example def col_product_loop(mat): ... Read More
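The snippet's code is cut off; a minimal sketch of the nested-loop idea (only the name col_product_loop comes from the snippet, the body and sample matrix are assumptions) could look like this:

def col_product_loop(mat):
    # Number of columns = length of the longest row (rows may be uneven)
    n_cols = max(len(row) for row in mat)
    products = []
    for j in range(n_cols):
        prod = 1
        for row in mat:
            # Skip rows that are too short to have a j-th element
            if j < len(row):
                prod *= row[j]
        products.append(prod)
    return products

mat = [[1, 2, 3], [4, 5], [6]]
print(col_product_loop(mat))  # [24, 10, 3]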

Introduction Extraction of useful information from high-dimensional datasets is made easier by Principal Component Analysis (PCA), a popular dimensionality reduction method. It does this by re-projecting the data onto new axes along which the highest variance is captured. PCA reduces the complexity of a dataset while preserving its basic structure, and it helps with tasks such as feature selection, data compression, and noise reduction in data analysis. Image processing, bioinformatics, economics, and the social sciences are just a few of the places PCA has been put to use. ... Read More
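As a brief sketch of the re-projection idea, here is PCA applied with scikit-learn; the choice of the iris dataset and of two components is an assumption for illustration:

from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

# Project the 4-dimensional iris features onto the 2 directions of highest variance
X = load_iris().data
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                # (150, 2)
print(pca.explained_variance_ratio_)  # share of variance captured by each component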
Statistical simulation is the task of using computer-based methods to generate random samples from a probability distribution so that we can model and analyse complex systems that exhibit random behaviour. In this article we are going to see how to use this powerful tool in Python to make predictions, generate insights and evaluate the performance of statistical algorithms. There are different types of statistical simulation, which are as follows: Monte Carlo simulations − generation of random samples from a probability distribution in order to estimate the expected value of a ... Read More
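To illustrate the Monte Carlo idea mentioned in the snippet, here is a small NumPy sketch that estimates an expected value from random samples; the target quantity and sample size are chosen only for illustration:

import numpy as np

rng = np.random.default_rng(seed=0)

# Monte Carlo estimate of E[X**2] for X ~ N(0, 1); the true value is 1
samples = rng.normal(loc=0.0, scale=1.0, size=100_000)
estimate = np.mean(samples ** 2)
print(f"Monte Carlo estimate: {estimate:.4f} (true value: 1.0)")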
A network is a collection of nodes and edges that represent the relationships or connections between those nodes. The nodes can represent various entities, such as individuals, organizations, genes, or websites, while the edges represent the connections or interactions between them. Network analysis is the study of the relationships between these entities, with each entity represented as a node in a network. In this article, we are going to see how to implement network analysis using Python. It involves the use of many mathematical, statistical and computational techniques. Network analysis can provide insights into the behaviour of complex systems and help to make ... Read More
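A minimal sketch of network analysis in Python, assuming the networkx library and a made-up set of entities (the snippet does not say which library or data the full article uses):

import networkx as nx

# Build a small undirected network of example entities
G = nx.Graph()
G.add_edges_from([("Alice", "Bob"), ("Alice", "Carol"),
                  ("Bob", "Carol"), ("Carol", "Dave")])

# Degree centrality: how connected each node is relative to the rest of the network
print(nx.degree_centrality(G))

# Shortest path between two entities
print(nx.shortest_path(G, "Alice", "Dave"))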
Python provides us with a variety of tools and libraries that help us work with the foundations of probability. Probability has wide-ranging use cases, from AI content detection to card games. The random module is often used for probability-related problem statements. Combined with libraries like numpy and scipy (and matplotlib and seaborn for visualization), it can be of great advantage when the data is large scale and mainly in the form of CSV files. Probability problem statements can further be combined with statistics to gain more insights. It doesn't matter if you are a beginner ... Read More
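As a small example of using the random module for a probability problem (the dice scenario and trial count are assumptions for illustration):

import random

random.seed(42)

# Empirically estimate the probability of rolling a total of 7 with two dice
trials = 100_000
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) + random.randint(1, 6) == 7)

print(f"Estimated P(sum = 7): {hits / trials:.4f} (exact value: {6/36:.4f})")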