
A Complete Tutorial on Tree Based Modeling from Scratch (in R & Python)

Published: 2025/3/21
Reposted from:

http://www.analyticsvidhya.com/blog/2016/04/complete-tutorial-tree-based-modeling-scratch-in-python/

Introduction

Tree based learning algorithms are considered to be among the best and most widely used supervised learning methods. Tree based methods empower predictive models with high accuracy, stability and ease of interpretation. Unlike linear models, they map non-linear relationships quite well. They are adaptable to solving any kind of problem at hand (classification or regression).

Methods like decision trees, random forest and gradient boosting are popularly used in all kinds of data science problems. Hence, for every analyst (freshers included), it's important to learn these algorithms and use them for modeling.

This tutorial is meant to help beginners learn tree based modeling from scratch. After successful completion of this tutorial, one is expected to become proficient at using tree based algorithms and building predictive models.

Note: This tutorial requires no prior knowledge of machine learning. However, elementary knowledge of R or Python will be helpful. To get started you can follow the full tutorial in R and the full tutorial in Python.


Table of Contents

  • What is a Decision Tree? How does it work?
  • Regression Trees vs Classification Trees
  • How does a tree decide where to split?
  • What are the key parameters of model building and how can we avoid over-fitting in decision trees?
  • Are tree based models better than linear models?
  • Working with Decision Trees in R and Python
  • What are the ensemble methods of tree based models?
  • What is Bagging? How does it work?
  • What is Random Forest ? How does it work?
  • What is Boosting ? How does it work?
  • Which is more powerful: GBM or Xgboost?
  • Working with GBM in R and Python
  • Working with Xgboost in R and Python
  • Where to Practice?

    1. What is a Decision Tree? How does it work?

    Decision tree is a type of supervised learning algorithm (having a pre-defined target variable) that is mostly used in classification problems. It works for both categorical and continuous input and output variables. In this technique, we split the population or sample into two or more homogeneous sets (or sub-populations) based on the most significant splitter / differentiator in the input variables.

    Example:-

    Let’s say we have a sample of 30 students with three variables: Gender (Boy/Girl), Class (IX/X) and Height (5 to 6 ft). 15 out of these 30 play cricket in leisure time. Now, I want to create a model to predict who will play cricket during the leisure period. In this problem, we need to segregate students who play cricket in their leisure time based on the most significant input variable among all three.

    This is where a decision tree helps: it will segregate the students based on all values of the three variables and identify the variable which creates the best homogeneous sets of students (which are heterogeneous to each other). In the snapshot below, you can see that the variable Gender is able to identify the best homogeneous sets compared to the other two variables.

    As mentioned above, a decision tree identifies the most significant variable and its value that gives the best homogeneous sets of the population. Now the question which arises is, how does it identify the variable and the split? To do this, decision trees use various algorithms, which we shall discuss in the following section.


    Types of Decision Trees

    The type of decision tree is based on the type of target variable we have. It can be of two types:

  • Categorical Variable Decision Tree: A decision tree which has a categorical target variable is called a categorical variable decision tree. Example: In the above scenario of the student problem, the target variable was “Student will play cricket or not” i.e. YES or NO.
  • Continuous Variable Decision Tree: A decision tree which has a continuous target variable is called a continuous variable decision tree.
  • Example: Let’s say we have a problem to predict whether a customer will pay his renewal premium with an insurance company (yes/no). Here we know that the income of the customer is a significant variable, but the insurance company does not have income details for all customers. Now, as we know this is an important variable, we can build a decision tree to predict customer income based on occupation, product and various other variables. In this case, we are predicting values for a continuous variable.


    Important Terminology related to Decision Trees

    Let’s look at the basic terminology used with Decision trees:

  • Root Node: It represents the entire population or sample, and this further gets divided into two or more homogeneous sets.
  • Splitting: It is a process of dividing a node into two or more sub-nodes.
  • Decision Node: When a sub-node splits into further sub-nodes, it is called a decision node.
  • Leaf/ Terminal Node: Nodes that do not split are called Leaf or Terminal nodes.
  • Pruning: When we remove sub-nodes of a decision node, this process is called pruning. You can say it is the opposite process of splitting.
  • Branch / Sub-Tree: A sub-section of the entire tree is called a branch or sub-tree.
  • Parent and Child Node: A node which is divided into sub-nodes is called the parent node of the sub-nodes, whereas the sub-nodes are the children of the parent node.

    These are the terms commonly used for decision trees. As we know that every algorithm has advantages and disadvantages, below are the important factors which one should know.


    Advantages

  • Easy to Understand: Decision tree output is very easy to understand, even for people from a non-analytical background. It does not require any statistical knowledge to read and interpret. Its graphical representation is very intuitive and users can easily relate their hypotheses.
  • Useful in Data exploration: Decision tree is one of the fastest ways to identify the most significant variables and the relation between two or more variables. With the help of decision trees, we can create new variables / features that have better power to predict the target variable. You can refer to the article (Trick to enhance power of regression model) for one such trick. It can also be used in the data exploration stage. For example, when we are working on a problem where we have information available in hundreds of variables, a decision tree will help to identify the most significant variables.
  • Less data cleaning required: It requires less data cleaning compared to some other modeling techniques. It is not influenced by outliers and missing values to a fair degree.
  • Data type is not a constraint: It can handle both numerical and categorical variables.
  • Non Parametric Method: Decision tree is considered to be a non-parametric method. This means that decision trees have no assumptions about the space distribution and the classifier structure.

    Disadvantages

  • Over fitting: Over fitting is one of the most practical difficulties for decision tree models. This problem gets solved by setting constraints on model parameters and by pruning (discussed in detail below).
  • Not fit for continuous variables: While working with continuous numerical variables, a decision tree loses information when it categorizes variables into different categories.


    2. Regression Trees vs Classification Trees

    We all know that the terminal nodes (or leaves) lie at the bottom of the decision tree. This means that decision trees are typically drawn upside down, such that leaves are at the bottom and roots are at the top (shown below).

    Both the trees work in an almost similar fashion. Let’s look at the primary differences and similarities between classification and regression trees:

  • Regression trees are used when dependent variable is continuous. Classification trees are used when dependent variable is categorical.
  • In case of a regression tree, the value obtained by terminal nodes in the training data is the mean response of the observations falling in that region. Thus, if an unseen data observation falls in that region, we’ll make its prediction with the mean value.
  • In case of a classification tree, the value (class) obtained by a terminal node in the training data is the mode of the observations falling in that region. Thus, if an unseen data observation falls in that region, we’ll make its prediction with the mode value.
  • Both the trees divide the predictor space (independent variables) into distinct and non-overlapping regions. For the sake of simplicity, you can think of these regions as high-dimensional boxes.
  • Both the trees follow a top-down greedy approach known as recursive binary splitting. We call it ‘top-down’ because it begins from the top of the tree, when all the observations are available in a single region, and successively splits the predictor space into two new branches down the tree. It is known as ‘greedy’ because the algorithm cares only about the current split (it looks for the best variable available), and not about future splits which would lead to a better tree.
  • This splitting process is continued until a user-defined stopping criterion is reached. For example: we can tell the algorithm to stop once the number of observations per node becomes less than 50.
  • In both cases, the splitting process results in fully grown trees until the stopping criterion is reached. But a fully grown tree is likely to overfit the data, leading to poor accuracy on unseen data. This brings in ‘pruning’, one of the techniques used to tackle overfitting. We’ll learn more about it in a following section.


    3. How does a tree decide where to split?

    The decision of making strategic splits heavily affects a tree’s accuracy. The decision criteria are different for classification and regression trees.

    Decision trees use multiple algorithms to decide on splitting a node into two or more sub-nodes. The creation of sub-nodes increases the homogeneity of the resultant sub-nodes. In other words, we can say that the purity of the node increases with respect to the target variable. The decision tree splits the nodes on all available variables and then selects the split which results in the most homogeneous sub-nodes.

    The algorithm selection is also based on the type of target variable. Let’s look at the four most commonly used algorithms in decision trees:


    Gini Index

    Gini index says: if we select two items from a population at random, then they must be of the same class, and the probability of this is 1 if the population is pure.

  • It works with categorical target variable “Success” or “Failure”.
  • It performs only binary splits.
  • The higher the value of Gini, the higher the homogeneity.
  • CART (Classification and Regression Tree) uses the Gini method to create binary splits.
  • Steps to Calculate Gini for a split:

  • Calculate Gini for the sub-nodes, using the formula: sum of squares of the probabilities of success and failure (p^2+q^2).
  • Calculate Gini for the split using the weighted Gini score of each node of that split.

    Example: Referring to the example used above, where we want to segregate the students based on the target variable (playing cricket or not). In the snapshot below, we split the population using two input variables, Gender and Class. Now, I want to identify which split produces more homogeneous sub-nodes using the Gini index.

    Split on Gender:

  • Gini for sub-node Female = (0.2)*(0.2)+(0.8)*(0.8) = 0.68
  • Gini for sub-node Male = (0.65)*(0.65)+(0.35)*(0.35) = 0.55
  • Weighted Gini for Split on Gender = (10/30)*0.68+(20/30)*0.55 = 0.59

    Similarly, for Split on Class:

  • Gini for sub-node Class IX = (0.43)*(0.43)+(0.57)*(0.57) = 0.51
  • Gini for sub-node Class X = (0.56)*(0.56)+(0.44)*(0.44) = 0.51
  • Weighted Gini for Split on Class = (14/30)*0.51+(16/30)*0.51 = 0.51

    Above, you can see that the Gini score for the split on Gender is higher than for the split on Class; hence, the node split will take place on Gender.
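    The arithmetic above can be reproduced in a few lines of Python. This is a minimal sketch: `gini_node` and `gini_split` are illustrative helper names (not from any library), and the counts are the ones from the student example.

```python
# Gini score of a node using the tutorial's p^2 + q^2 formulation
# (higher = more homogeneous).
def gini_node(p_success):
    q_failure = 1 - p_success
    return p_success ** 2 + q_failure ** 2

# Weighted Gini of a split: nodes given as (node_size, p_success) pairs.
def gini_split(nodes):
    total = sum(size for size, _ in nodes)
    return sum(size / total * gini_node(p) for size, p in nodes)

# Gender split: Female (10 students, 20% play), Male (20 students, 65% play).
gender = gini_split([(10, 0.2), (20, 0.65)])
# Class split: IX (14 students, 43% play), X (16 students, 56% play).
klass = gini_split([(14, 0.43), (16, 0.56)])
print(round(gender, 2), round(klass, 2))  # 0.59 0.51 -> Gender wins
```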


    Chi-Square

    It is an algorithm to find out the statistical significance of the differences between sub-nodes and the parent node. We measure it by the sum of squares of the standardized differences between the observed and expected frequencies of the target variable.

  • It works with categorical target variable “Success” or “Failure”.
  • It can perform two or more splits.
  • The higher the value of Chi-Square, the higher the statistical significance of the differences between a sub-node and the parent node.
  • The Chi-Square of each node is calculated using the formula:
  • Chi-square = ((Actual – Expected)^2 / Expected)^1/2
  • It generates a tree called CHAID (Chi-square Automatic Interaction Detector).
  • Steps to Calculate Chi-square for a split:

  • Calculate the Chi-square for each individual node by calculating the deviation for both Success and Failure.
  • Calculate the Chi-square of the split as the sum of all Chi-square values of Success and Failure of each node of the split.
  • Example: Let’s work with the above example that we used to calculate Gini.

    Split on Gender:

  • First, populate the actual values for the Female node for “Play Cricket” and “Not Play Cricket”; here these are 2 and 8 respectively.
  • Calculate the expected values for “Play Cricket” and “Not Play Cricket”; here each would be 5, because the parent node has a probability of 50% and we apply the same probability to the Female count (10).
  • Calculate the deviations using the formula Actual – Expected. For “Play Cricket” it is (2 – 5 = -3) and for “Not Play Cricket” it is (8 – 5 = 3).
  • Calculate the Chi-square of the node for “Play Cricket” and “Not Play Cricket” using the formula ((Actual – Expected)^2 / Expected)^1/2. You can refer to the table below for the calculation.
  • Follow similar steps for calculating the Chi-square values for the Male node.
  • Now add all Chi-square values to calculate the Chi-square for the split on Gender.

    Split on Class:

    Perform similar steps of calculation for the split on Class and you will come up with the table below.

    Above, you can see that Chi-square also identifies the Gender split as more significant compared to Class.
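    The same numbers can be checked with a small Python sketch; `chi_component` and `chi_square_split` are illustrative helper names, and the tutorial's square-root form of each component is used.

```python
import math

# Per-cell value in the tutorial's form: sqrt((Actual - Expected)^2 / Expected).
def chi_component(actual, expected):
    return math.sqrt((actual - expected) ** 2 / expected)

# Chi-square of a split: sum the component over every (actual, expected)
# cell, i.e. the success and failure counts of each sub-node.
def chi_square_split(cells):
    return sum(chi_component(a, e) for a, e in cells)

# Gender split: Female plays 2 / doesn't 8 (expected 5 each, parent is 50/50);
# Male plays 13 / doesn't 7 (expected 10 each).
gender = chi_square_split([(2, 5), (8, 5), (13, 10), (7, 10)])
# Class split: IX plays 6 / doesn't 8 (expected 7 each);
# X plays 9 / doesn't 7 (expected 8 each).
klass = chi_square_split([(6, 7), (8, 7), (9, 8), (7, 8)])
print(round(gender, 2), round(klass, 2))  # 4.58 1.46 -> Gender is more significant
```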


    Information Gain:

    Look at the image below and think which node can be described easily. I am sure your answer is C, because it requires less information as all of its values are similar. On the other hand, B requires more information to describe it, and A requires the maximum information. In other words, we can say that C is a pure node, B is less impure and A is more impure.

    Now, we can draw a conclusion that a less impure node requires less information to describe it, and a more impure node requires more information. Information theory has a measure to define this degree of disorganization in a system, known as entropy. If the sample is completely homogeneous, then the entropy is zero, and if the sample is equally divided (50% – 50%), it has an entropy of one.

    Entropy can be calculated using the formula:

    Entropy = -p log2 p - q log2 q

    Here p and q are the probabilities of success and failure respectively in that node. Entropy is also used with categorical target variables. It chooses the split which has the lowest entropy compared to the parent node and other splits. The lesser the entropy, the better it is.

    Steps to calculate entropy for a split:

  • Calculate entropy of parent node
  • Calculate entropy of each individual node of split and calculate weighted average of all sub-nodes available in split.
  • Example: Let’s use this method to identify the best split for the student example.

  • Entropy for parent node = -(15/30) log2 (15/30) – (15/30) log2 (15/30) = 1. Here 1 shows that it is an impure node.
  • Entropy for Female node = -(2/10) log2 (2/10) – (8/10) log2 (8/10) = 0.72, and for Male node, -(13/20) log2 (13/20) – (7/20) log2 (7/20) = 0.93.
  • Entropy for split on Gender = weighted entropy of sub-nodes = (10/30)*0.72 + (20/30)*0.93 = 0.86
  • Entropy for Class IX node = -(6/14) log2 (6/14) – (8/14) log2 (8/14) = 0.99, and for Class X node, -(9/16) log2 (9/16) – (7/16) log2 (7/16) = 0.99.
  • Entropy for split on Class = (14/30)*0.99 + (16/30)*0.99 = 0.99

    Above, you can see that the entropy for the split on Gender is the lowest among all, so the tree will split on Gender. We can derive information gain from entropy as 1 – Entropy.
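    The entropy arithmetic can also be sketched in Python (`entropy` and `entropy_split` are illustrative helper names, with counts from the student example):

```python
import math

# Two-class entropy: -p*log2(p) - q*log2(q); a pure node has entropy 0.
def entropy(p_success):
    if p_success in (0, 1):
        return 0.0
    q = 1 - p_success
    return -p_success * math.log2(p_success) - q * math.log2(q)

# Weighted entropy of a split: nodes given as (node_size, p_success) pairs.
def entropy_split(nodes):
    total = sum(size for size, _ in nodes)
    return sum(size / total * entropy(p) for size, p in nodes)

parent = entropy(15 / 30)                              # 1.0: fully impure
gender = entropy_split([(10, 2 / 10), (20, 13 / 20)])  # ~0.86
klass = entropy_split([(14, 6 / 14), (16, 9 / 16)])    # ~0.99
print(round(parent, 2), round(gender, 2), round(klass, 2))
# Information gain of the Gender split = 1 - 0.86 = 0.14, the largest gain.
```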


    Reduction in Variance

    Till now, we have discussed the algorithms for categorical target variables. Reduction in variance is an algorithm used for continuous target variables (regression problems). This algorithm uses the standard formula of variance to choose the best split. The split with the lower variance is selected as the criterion to split the population:

    Variance = Σ(X – X̄)^2 / n

    Above, X̄ is the mean of the values, X is the actual value and n is the number of values.

    Steps to calculate Variance:

  • Calculate variance for each node.
  • Calculate variance for each split as weighted average of each node variance.
  • Example: Let’s assign the numerical value 1 for play cricket and 0 for not playing cricket. Now follow these steps to identify the right split:

  • Variance for Root node: here the mean value is (15*1 + 15*0)/30 = 0.5 and we have 15 ones and 15 zeros. The variance would be ((1-0.5)^2+(1-0.5)^2+….15 times+(0-0.5)^2+(0-0.5)^2+…15 times) / 30, which can be written as (15*(1-0.5)^2+15*(0-0.5)^2) / 30 = 0.25
  • Mean of Female node = (2*1+8*0)/10 = 0.2 and Variance = (2*(1-0.2)^2+8*(0-0.2)^2) / 10 = 0.16
  • Mean of Male node = (13*1+7*0)/20 = 0.65 and Variance = (13*(1-0.65)^2+7*(0-0.65)^2) / 20 = 0.23
  • Variance for Split on Gender = weighted variance of sub-nodes = (10/30)*0.16 + (20/30)*0.23 = 0.21
  • Mean of Class IX node = (6*1+8*0)/14 = 0.43 and Variance = (6*(1-0.43)^2+8*(0-0.43)^2) / 14 = 0.24
  • Mean of Class X node = (9*1+7*0)/16 = 0.56 and Variance = (9*(1-0.56)^2+7*(0-0.56)^2) / 16 = 0.25
  • Variance for Split on Class = (14/30)*0.24 + (16/30)*0.25 = 0.25

    Above, you can see that the Gender split has a lower variance compared to the parent node, so the split would take place on the Gender variable.
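    These variance calculations can be sketched as follows (`node_variance` and `split_variance` are illustrative helper names):

```python
# Variance of a node where k of n observations are 1 (plays cricket) and the
# rest are 0, using Var = sum((x - mean)^2) / n.
def node_variance(k_ones, n_total):
    mean = k_ones / n_total
    return (k_ones * (1 - mean) ** 2 + (n_total - k_ones) * mean ** 2) / n_total

# Weighted variance of a split: nodes given as (k_ones, n_total) pairs.
def split_variance(nodes):
    total = sum(n for _, n in nodes)
    return sum(n / total * node_variance(k, n) for k, n in nodes)

root = node_variance(15, 30)                  # 0.25
gender = split_variance([(2, 10), (13, 20)])  # ~0.21
klass = split_variance([(6, 14), (9, 16)])    # ~0.25
print(root, round(gender, 3), round(klass, 3))
# Gender gives the larger variance reduction, so the tree splits on Gender.
```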

    Until here, we have learnt about the basics of decision trees and the decision making process involved in choosing the best splits when building a tree model. As I said, decision trees can be applied both to regression and classification problems. Let’s understand these aspects in detail.


    4. What are the key parameters of tree modeling and how can we avoid over-fitting in decision trees?

    Overfitting is one of the key challenges faced while modeling decision trees. If there is no limit set on a decision tree, it will give you 100% accuracy on the training set, because in the worst case it will end up making one leaf for each observation. Thus, preventing overfitting is pivotal while modeling a decision tree, and it can be done in 2 ways:

  • Setting constraints on tree size
  • Tree pruning
    Let’s discuss both of these briefly.

    Setting Constraints on Tree Size

    This can be done by using various parameters which are used to define a tree. First, let’s look at the general structure of a decision tree:

    The parameters used for defining a tree are further explained below. The parameters described below are irrespective of tool. It is important to understand the role of the parameters used in tree modeling. These parameters are available in R & Python.

  • Minimum samples for a node split
    • Defines the minimum number of samples (or observations) which are required in a node to be considered for splitting.
    • Used to control over-fitting. Higher values prevent a model from learning relations which might be highly specific to the particular sample selected for a tree.
    • Too high values can lead to under-fitting; hence, it should be tuned using CV.
  • Minimum samples for a terminal node (leaf)
    • Defines the minimum samples (or observations) required in a terminal node or leaf.
    • Used to control over-fitting similar to min_samples_split.
    • Generally lower values should be chosen for imbalanced class problems because the regions in which the minority class will be in majority will be very small.
  • Maximum depth of tree (vertical depth)
    • The maximum depth of a tree.
    • Used to control over-fitting as higher depth will allow model to learn relations very specific to a particular sample.
    • Should be tuned using CV.
  • Maximum number of terminal nodes
    • The maximum number of terminal nodes or leaves in a tree.
    • Can be defined in place of max_depth. Since binary trees are created, a depth of ‘n’ would produce a maximum of 2^n leaves.
  • Maximum features to consider for split
    • The number of features to consider while searching for the best split. These will be randomly selected.
    • As a thumb-rule, the square root of the total number of features works great, but we should check up to 30-40% of the total number of features.
    • Higher values can lead to over-fitting, but this depends from case to case.
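    In scikit-learn, these knobs correspond to DecisionTreeClassifier constructor parameters (min_samples_split, min_samples_leaf, max_depth, max_leaf_nodes, max_features). A minimal sketch, using the bundled iris data as a stand-in for your own data set; the exact values below are illustrative, not recommendations:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Each argument below matches one of the constraints described above;
# in practice these should be tuned with cross-validation.
model = DecisionTreeClassifier(
    min_samples_split=20,   # min observations a node needs to be split
    min_samples_leaf=5,     # min observations in a terminal node
    max_depth=4,            # cap on vertical depth
    max_leaf_nodes=8,       # alternative cap: total number of leaves
    max_features="sqrt",    # features considered at each split
    random_state=0,
)
model.fit(X, y)
print(model.get_depth(), model.get_n_leaves())  # stays within the caps
```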

    Tree Pruning

    As discussed earlier, the technique of setting constraints is a greedy approach. In other words, it will check for the best split instantaneously and move forward until one of the specified stopping conditions is reached. Let’s consider the following case when you’re driving:

    There are 2 lanes:

  • A lane with cars moving at 80km/h
  • A lane with trucks moving at 30km/h
  • At this instant, you are the yellow car and you have 2 choices:

  • Take a left and overtake the other 2 cars quickly
  • Keep moving in the present lane
  • Let’s analyze these choices. In the former choice, you’ll immediately overtake the car ahead and reach behind the truck, then start moving at 30 km/h, looking for an opportunity to move back right. All cars originally behind you move ahead in the meanwhile. This would be the optimum choice if your objective is to maximize the distance covered in, say, the next 10 seconds. In the latter choice, you sail through at the same speed, cross the trucks and then overtake, maybe, depending on the situation ahead. Greedy you!

    This is exactly the difference between normal decision tree & pruning. A decision tree with constraints won’t see the truck ahead and adopt a greedy approach by taking a left. On the other hand if we use pruning, we in effect look at a few steps ahead and make a choice.

    So we know pruning is better. But how to implement it in a decision tree? The idea is simple.

  • We first make the decision tree to a large depth.
  • Then we start at the bottom and start removing leaves which are giving us negative returns when compared from the top.
  • Suppose a split is giving us a gain of say -10 (loss of 10) and the next split on that gives us a gain of 20. A simple decision tree will stop at step 1, but with pruning, we will see that the overall gain is +10 and keep both leaves.
  • Note that sklearn’s decision tree classifier did not support pruning at the time this article was written. Advanced packages like xgboost have adopted tree pruning in their implementation. But the library rpart in R provides a function to prune. Good for R users!
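    Since this article was written, scikit-learn (from version 0.22) has added minimal cost-complexity pruning through the ccp_alpha parameter. A small sketch on the bundled iris data, used here only as a stand-in:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Grow a full tree, then a pruned one: after growing, subtrees whose
# cost-complexity improvement falls below ccp_alpha are collapsed.
full = DecisionTreeClassifier(random_state=0).fit(X, y)
pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=0.02).fit(X, y)
print(full.get_n_leaves(), pruned.get_n_leaves())  # the pruned tree is typically much smaller
```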


    5. Are tree based models better than linear models?

    “If I can use logistic regression for classification problems and linear regression for regression problems, why is there a need to use trees”? Many of us have this question. And, this is a valid one too.

    Actually, you can use any algorithm. It is dependent on the type of problem you are solving. Let’s look at some key factors which will help you to decide which algorithm to use:

  • If the relationship between dependent & independent variable is well approximated by a linear model, linear regression will outperform tree based model.
  • If there is a high non-linearity & complex relationship between dependent & independent variables, a tree model will outperform a classical regression method.
  • If you need to build a model which is easy to explain to people, a decision tree model will always do better than a linear model. Decision tree models are even simpler to interpret than linear regression!

    6. Working with Decision Trees in R and Python

    For R users and Python users, decision trees are quite easy to implement. Let’s quickly look at the code which can get you started with this algorithm. For ease of use, I’ve shared standard code where you’ll need to replace your data set name and variables to get started.

    For R users, there are multiple packages available to implement decision trees, such as ctree, rpart, tree, etc.

    > library(rpart)
    > x <- cbind(x_train, y_train)
    # grow tree
    > fit <- rpart(y_train ~ ., data = x, method = "class")
    > summary(fit)
    # Predict Output
    > predicted <- predict(fit, x_test)

    In the code above:

    • y_train – represents dependent variable.
    • x_train – represents independent variable
    • x – represents training data.


    For Python users, below is the code:

    # Import Library
    # Import other necessary libraries like pandas, numpy...
    from sklearn import tree

    # Assumed you have X (predictor) and y (target) for the training data set
    # and x_test (predictor) of the test dataset

    # Create tree object
    model = tree.DecisionTreeClassifier(criterion='gini')
    # For classification; you can change the criterion to 'gini' or 'entropy'
    # (information gain). By default it is 'gini'.
    # model = tree.DecisionTreeRegressor() for regression

    # Train the model using the training sets and check the score
    model.fit(X, y)
    model.score(X, y)

    # Predict Output
    predicted = model.predict(x_test)


    7. What are ensemble methods in tree based modeling?

    The literal meaning of the word ‘ensemble’ is group. Ensemble methods involve a group of predictive models working together to achieve better accuracy and model stability. Ensemble methods are known to impart a supreme boost to tree based models.

    Like every other model, a tree based model also suffers from the plague of bias and variance. Bias means, ‘how much on an average are the predicted values different from the actual value.’ Variance means, ‘how different will the predictions of the model be at the same point if different samples are taken from the same population’.

    If you build a small tree, you will get a model with low variance and high bias. How do you manage to balance the trade-off between bias and variance?

    Normally, as you increase the complexity of your model, you will see a reduction in prediction error due to lower bias in the model. As you continue to make your model more complex, you end up over-fitting your model and your model will start suffering from high variance.

    A champion model should maintain a balance between these two types of errors. This is known as the trade-off management of bias-variance errors. Ensemble learning is one way to execute this trade-off analysis.

    Some of the commonly used ensemble methods include: Bagging, Boosting and Stacking. In this tutorial, we’ll focus on Bagging and Boosting in detail.


    8. What is Bagging? How does it work?

    Bagging is a technique used to reduce the variance of our predictions by combining the results of multiple classifiers modeled on different sub-samples of the same data set. The following figure will make it clearer:


    The steps followed in bagging are:

  • Create Multiple DataSets:
    • Sampling is done?with replacement?on the original data and new datasets are formed.
    • The new data sets can have a fraction of the columns as well as rows, which are generally hyper-parameters in a bagging model
    • Taking row and column fractions less than 1 helps in making robust models, less prone to overfitting
  • Build Multiple Classifiers:
    • Classifiers are built on each data set.
    • Generally the same classifier is modeled on each data set and predictions are made.
  • Combine Classifiers:
    • The predictions of all the classifiers are combined using a mean, median or mode value depending on the problem at hand.
    • The combined values are generally more robust than a single model.
  • Note that here the number of models built is not a hyper-parameter. A higher number of models is always better, or may give similar performance compared to lower numbers. It can be theoretically shown that, under some assumptions, the variance of the combined predictions is reduced to 1/n (n: number of classifiers) of the original variance.
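    The steps above map directly onto scikit-learn’s BaggingClassifier, whose default base learner is a decision tree. A minimal sketch on the bundled iris data; the row/column fractions below are illustrative hyper-parameter choices, not recommendations:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier

X, y = load_iris(return_X_y=True)

# 50 models (decision trees by default), each fit on a bootstrap sample of
# 80% of the rows and half of the columns; predictions are combined by
# majority vote.
bag = BaggingClassifier(
    n_estimators=50,
    max_samples=0.8,
    max_features=0.5,
    random_state=0,
)
bag.fit(X, y)
print(bag.score(X, y))  # training accuracy of the combined vote
```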

    There are various implementations of bagging models. Random forest is one of them and we’ll discuss it next.


    9. What is Random Forest? How does it work?

    Random Forest is considered to be a panacea for all data science problems. On a funny note, when you can’t think of any algorithm (irrespective of situation), use random forest!

    Random Forest is a versatile machine learning method capable of performing both regression and classification tasks. It also undertakes dimension reduction, treats missing values and outlier values, and performs other essential steps of data exploration, doing a fairly good job at each. It is a type of ensemble learning method, where a group of weak models combine to form a powerful model.


    How does it work?

    In Random Forest, we grow multiple trees, as opposed to a single tree in the CART model (see the comparison between CART and Random Forest here, part1 and part2). To classify a new object based on attributes, each tree gives a classification and we say the tree “votes” for that class. The forest chooses the classification having the most votes (over all the trees in the forest), and in case of regression, it takes the average of the outputs of the different trees.

    It works in the following manner.?Each tree is planted & grown as follows:

  • Assume the number of cases in the training set is N. Then, a sample of these N cases is taken at random but with replacement. This sample will be the training set for growing the tree.
  • If there are M input variables, a number m<M is specified such that at each node, m variables are selected at random out of the M. The best split on these m is used to split the node. The value of m is held constant while we grow the forest.
  • Each tree is grown to the largest extent possible and there is no pruning.
  • Predict new data by aggregating the predictions of the ntree trees (i.e., majority votes for classification, average for regression).

    To understand this algorithm in more detail using a case study, please read the article “Introduction to Random forest – Simplified“.


    Advantages of Random Forest

    • This algorithm can solve both types of problems, i.e. classification and regression, and does a decent estimation on both fronts.
    • One of the benefits of Random Forest which excites me most is its power to handle large data sets with higher dimensionality. It can handle thousands of input variables and identify the most significant ones, so it is considered one of the dimensionality reduction methods. Further, the model outputs the importance of variables, which can be a very handy feature (on some random data set).
    • It has an effective method for estimating missing data and maintains accuracy when a large proportion of the data is missing.
    • It has methods for balancing errors in data sets where classes are imbalanced.
    • The capabilities of the above can be extended to unlabeled data, leading to unsupervised clustering, data views and outlier detection.
    • Random Forest involves sampling of the input data with replacement, called bootstrap sampling. Here, about one third of the data is not used for training and can be used for testing. These are called the out-of-bag samples. The error estimated on these out-of-bag samples is known as the out-of-bag error. Studies of out-of-bag error estimates give evidence that the out-of-bag estimate is as accurate as using a test set of the same size as the training set. Therefore, using the out-of-bag error estimate removes the need for a set-aside test set.
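    The out-of-bag idea is directly available in scikit-learn’s RandomForestClassifier. A minimal sketch, with the bundled iris data standing in for your own data set:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

# oob_score=True scores every tree on the roughly one-third of rows left out
# of its bootstrap sample, giving a built-in validation estimate without a
# separate test set.
rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)
print(rf.oob_score_)            # out-of-bag accuracy estimate
print(rf.feature_importances_)  # the variable importance output noted above
```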


    Disadvantages of Random Forest

    • It surely does a good job at classification, but not as good for regression problems, as it does not give precise continuous predictions. In case of regression, it doesn't predict beyond the range of the training data, and it may over-fit data sets that are particularly noisy.
    • Random Forest can feel like a black box approach for statistical modelers – you have very little control over what the model does. You can at best try different parameters and random seeds!
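The first limitation is easy to demonstrate: a random forest's regression prediction is always an average of training targets, so it can never leave the range seen during training. A sketch on made-up data:

```python
# Sketch: a random forest cannot extrapolate beyond the training range
# (the linear toy data is an illustrative assumption)
import numpy as np
from sklearn.ensemble import RandomForestRegressor

X_train = np.arange(0, 10, 0.1).reshape(-1, 1)
y_train = 2 * X_train.ravel()  # simple linear trend, y in [0, 19.8]

rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(X_train, y_train)

# The true value at x=20 is 40, but every tree can only average
# training targets, so the prediction stays within [0, 19.8]
pred = rf.predict([[20]])[0]
print(pred <= y_train.max())  # True
```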


    Python & R implementation

    Random forests have well-known implementations in R packages and Python's scikit-learn. Let's look at the code for loading a random forest model in R and Python below:

    Python

    #Import Library
    from sklearn.ensemble import RandomForestClassifier #use RandomForestRegressor for regression problem
    #Assumed you have, X (predictor) and Y (target) for training data set and x_test(predictor) of test_dataset
    # Create Random Forest object
    model= RandomForestClassifier(n_estimators=1000)
    # Train the model using the training sets and check score
    model.fit(X, y)
    #Predict Output
    predicted= model.predict(x_test)
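The snippet above assumes you already have X, y and x_test. A self-contained, runnable variant of the same steps, using the iris data bundled with scikit-learn as a stand-in for your own data set:

```python
# Runnable random forest sketch (iris data is an illustrative stand-in)
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=42)

# Create and train the Random Forest object
model = RandomForestClassifier(n_estimators=1000, random_state=42)
model.fit(X_train, y_train)

# Predict output and check the score on held-out data
predicted = model.predict(X_test)
accuracy = model.score(X_test, y_test)
```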


    R Code

    > library(randomForest)
    > x <- cbind(x_train,y_train)
    # Fitting model
    > fit <- randomForest(Species ~ ., x, ntree=500)
    > summary(fit)
    # Predict Output
    > predicted= predict(fit, x_test)


    10. What is Boosting ? How does it work?

    Definition: The term ‘Boosting’ refers to a family of algorithms which convert weak learners into strong learners.

    Let’s understand this definition in detail by solving a problem of spam email identification:

    How would you classify an email as SPAM or not? Like everyone else, our initial approach would be to identify ‘spam’ and ‘not spam’ emails using the following criteria. If:

  • Email has only one image file (promotional image), It's a SPAM
  • Email has only link(s), It's a SPAM
  • Email body consists of sentences like “You won a prize money of $ xxxxxx”, It's a SPAM
  • Email from our official domain “Analyticsvidhya.com”, Not a SPAM
  • Email from a known source, Not a SPAM
  • Above, we’ve defined multiple rules to classify?an email into ‘spam’ or ‘not spam’.?But, do you think these rules individually are strong enough to successfully classify?an email? No.

    Individually, these rules are not powerful enough to classify an email into ‘spam’ or ‘not spam’. Therefore, these rules are called weak learners.

    To convert weak learner to strong learner, we’ll combine the prediction of each weak learner using methods like:

    • Using average/weighted average
    • Considering the prediction with the higher vote

    For example: Above, we have defined 5 weak learners. Out of these 5, 3 voted ‘SPAM’ and 2 voted ‘Not a SPAM’. In this case, by default, we'll consider the email as SPAM because we have the higher vote (3) for ‘SPAM’.
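The vote above can be sketched in a few lines (the labels are the illustrative ones from the example):

```python
# Sketch: combining five weak-rule predictions by majority vote
from collections import Counter

weak_predictions = ["SPAM", "SPAM", "SPAM", "Not a SPAM", "Not a SPAM"]

# most_common(1) returns the label with the highest vote and its count
final, votes = Counter(weak_predictions).most_common(1)[0]
print(final, votes)  # SPAM 3
```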


    How does it work?

    Now we know that boosting combines weak learners, a.k.a. base learners, to form a strong rule. An immediate question which should pop into your mind is, ‘How does boosting identify weak rules?‘

    To find a weak rule, we apply base learning (ML) algorithms with a different distribution. Each time the base learning algorithm is applied, it generates a new weak prediction rule. This is an iterative process. After many iterations, the boosting algorithm combines these weak rules into a single strong prediction rule.

    Here’s another question which might haunt you, ‘How do we choose different distribution for each round?’

    For choosing the right distribution, here are the following steps:

    Step 1: The base learner takes all the distributions and assigns equal weight or attention to each observation.

    Step 2: If there is any prediction error caused by the first base learning algorithm, then we pay higher attention to the observations having prediction error. Then, we apply the next base learning algorithm.

    Step 3: Iterate Step 2 till the limit of the base learning algorithm is reached or higher accuracy is achieved.

    Finally, it combines the outputs from the weak learners and creates a strong learner which eventually improves the prediction power of the model. Boosting pays higher focus to examples which are mis-classified or have higher errors from preceding weak rules.
    There are many boosting algorithms which impart additional boost to model’s accuracy. In this tutorial, we’ll learn about the two most commonly used algorithms i.e. Gradient Boosting (GBM) and XGboost.
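Steps 1-3 can be sketched with an AdaBoost-style loop, one classic boosting algorithm that re-weights observations exactly as described (the data set and the choice of decision stumps as base learners are illustrative assumptions):

```python
# Sketch of Steps 1-3: re-weighting mis-classified observations,
# AdaBoost-style (synthetic data; decision stumps as weak learners)
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=1)
y_signed = np.where(y == 1, 1, -1)

n_rounds = 10
weights = np.full(len(X), 1 / len(X))   # Step 1: equal attention
stumps, alphas = [], []
for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y_signed, sample_weight=weights)
    pred = stump.predict(X)
    err = weights[pred != y_signed].sum()
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
    # Step 2: pay higher attention to observations with prediction error
    weights *= np.exp(-alpha * y_signed * pred)
    weights /= weights.sum()
    stumps.append(stump)
    alphas.append(alpha)

# Combine the weak rules into a single strong prediction rule
strong = np.sign(sum(a * s.predict(X) for a, s in zip(alphas, stumps)))
accuracy = (strong == y_signed).mean()
```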


    11. Which is more powerful: GBM or Xgboost?

    I’ve always admired the boosting capabilities that xgboost algorithm. At times, I’ve found that it provides?better result compared to GBM implementation, but at times you might find that the gains are just marginal. When I explored more about its performance and science behind its high accuracy, I discovered many advantages of Xgboost over GBM:

  • Regularization:
    • A standard GBM implementation has no regularization, unlike XGBoost; this regularization helps XGBoost reduce overfitting.
    • In fact, XGBoost is also known as a ‘regularized boosting‘ technique.
  • Parallel Processing:
    • XGBoost implements parallel processing and is blazingly fast compared to GBM.
    • But hang on, we know that boosting is a sequential process, so how can it be parallelized? We know that each tree can be built only after the previous one, so what stops us from building each tree using all cores? I hope you get where I'm coming from. Check this link out to explore further.
    • XGBoost also supports implementation on Hadoop.
  • High Flexibility
    • XGBoost allows users to define custom optimization objectives and evaluation criteria.
    • This adds a whole new dimension to the model and there is no limit to what we can do.
  • Handling Missing Values
    • XGBoost has an in-built routine to handle missing values.
    • The user is required to supply a value different from other observations and pass that as a parameter. XGBoost tries different things as it encounters a missing value on each node and learns which path to take for missing values in the future.
  • Tree Pruning:
    • A GBM would stop splitting a node when it encounters a negative loss in the split. Thus it is more of a greedy algorithm.
    • XGBoost, on the other hand, makes splits up to the max_depth specified and then starts pruning the tree backwards, removing splits beyond which there is no positive gain.
    • Another advantage is that sometimes a split with negative loss, say -2, may be followed by a split with positive loss +10. GBM would stop as it encounters -2. But XGBoost will go deeper, see a combined effect of +8 from the splits and keep both.
  • Built-in Cross-Validation
    • XGBoost allows the user to run a cross-validation at each iteration of the boosting process, and thus it is easy to get the exact optimum number of boosting iterations in a single run.
    • This is unlike GBM, where we have to run a grid-search and only limited values can be tested.
  • Continue on Existing Model
    • The user can start training an XGBoost model from the last iteration of a previous run. This can be a significant advantage in certain specific applications.
    • GBM implementation of sklearn also has this feature so they are even on this point.

    12. Working with GBM in R and Python

    Before we start working, let's quickly understand the important parameters and the working of this algorithm. This will be helpful for both R and Python users. Below is the overall pseudo-code of the GBM algorithm for 2 classes:

    1. Initialize the outcome
    2. Iterate from 1 to total number of trees
      2.1 Update the weights for targets based on previous run (higher for the ones mis-classified)
      2.2 Fit the model on selected subsample of data
      2.3 Make predictions on the full set of observations
      2.4 Update the output with current results taking into account the learning rate
    3. Return the final output.

    This is an extremely simplified (probably naive) explanation of GBM's working. But it will help every beginner understand this algorithm.
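The pseudo-code above can be sketched from scratch for the regression case, where "updating the weights for targets" reduces to fitting each new tree on the current residuals (squared loss; the synthetic data is an illustrative assumption):

```python
# A minimal from-scratch gradient boosting loop for regression
# following the pseudo-code above (synthetic 1-D data)
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X.ravel()) + rng.normal(scale=0.1, size=200)

learning_rate, n_trees = 0.1, 100
prediction = np.full_like(y, y.mean())      # 1. initialize the outcome
trees = []
for _ in range(n_trees):                    # 2. iterate over trees
    residuals = y - prediction              # 2.1 re-target toward current errors
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residuals)                  # 2.2 fit on the residual targets
    prediction += learning_rate * tree.predict(X)  # 2.3-2.4 update with learning rate
    trees.append(tree)

mse = np.mean((y - prediction) ** 2)        # 3. quality of the final output
```

Each tree corrects what the ensemble so far gets wrong, and the learning rate damps each correction.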

    Let's consider the important GBM parameters used to improve model performance in Python:

  • learning_rate
    • This determines the impact of each tree on the final outcome (step 2.4). GBM works by starting with an initial estimate which is updated using the output of each tree. The learning parameter controls the magnitude of this change in the estimates.
    • Lower values are generally preferred as they make the model robust to the specific characteristics of each tree, thus allowing it to generalize well.
    • Lower values also require a higher number of trees to model all the relations, which is computationally expensive.
  • n_estimators
    • The number of sequential trees to be modeled (step 2)
    • Though GBM is fairly robust at a higher number of trees, it can still overfit at a point. Hence, this should be tuned using CV for a particular learning rate.
  • subsample
    • The fraction of observations to be selected for each tree. Selection is done by random sampling.
    • Values slightly less than 1 make the model robust by reducing the variance.
    • Typical values ~0.8 generally work fine but can be fine-tuned further.
    Apart from these, there are certain miscellaneous parameters which affect overall functionality:

  • loss
    • It refers to the loss function to be minimized in each split.
    • It can have various values for classification and regression cases. Generally the default values work fine. Other values should be chosen only if you understand their impact on the model.
  • init
    • This affects initialization of the output.
    • This can be used if we have made another model whose outcome is to be used as the initial estimate for GBM.
  • random_state
    • The random number seed so that same random numbers are generated every time.
    • This is important for parameter tuning. If we don’t fix the random number, then we’ll have different outcomes for subsequent runs on the same parameters and it becomes difficult to compare models.
    • It can potentially result in overfitting to a particular random sample selected. We can try running models for different random samples, which is computationally expensive and generally not used.
  • verbose
    • The type of output to be printed when the model fits. The different values can be:
      • 0: no output generated (default)
      • 1: output generated for trees at certain intervals
      • >1: output generated for all trees
  • warm_start
    • This parameter has an interesting application and can help a lot if used judiciously.
    • Using this, we can fit additional trees on previous fits of a model. It can save a lot of time, and you should explore this option for advanced applications.
  • presort
    • Select whether to presort data for faster splits.
    • It makes the selection automatically by default but it can be changed if needed.
    I know it's a long list of parameters, but I have simplified it for you in an excel file which you can download from this GitHub repository.
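The advice above (tune n_estimators via CV for a chosen learning rate) can be sketched with scikit-learn's GridSearchCV; the data set and grid values below are illustrative assumptions:

```python
# Sketch: tuning learning_rate and n_estimators with cross-validation
# (synthetic data; the grid values are illustrative, not recommendations)
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=300, random_state=5)
param_grid = {"learning_rate": [0.05, 0.1], "n_estimators": [50, 100]}

# 3-fold CV over every parameter combination
search = GridSearchCV(GradientBoostingClassifier(random_state=5),
                      param_grid, cv=3)
search.fit(X, y)
best_params = search.best_params_
best_score = search.best_score_
```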

    For R users, using the caret package, the main tuning parameters are:

  • n.trees – It refers to the number of trees (iterations) to grow
  • interaction.depth?– It determines the complexity of the tree i.e. total number of splits it has to perform on a tree (starting from a single node)
  • shrinkage – It refers to the learning rate. This is similar to learning_rate in Python (shown above).
  • n.minobsinnode – It refers to the minimum number of training samples required in a node to perform splitting

    GBM in R (with cross validation)

    I’ve shared the standard codes in R and Python. At your end, you’ll be required to change the value of dependent variable and data set name used in the codes below. Considering the ease of implementing GBM in R, one can easily perform tasks like cross validation and grid search with this package.

    > library(caret)
    > fitControl <- trainControl(method = "cv", number = 10)   # 10-fold cross validation
    > tune_Grid <- expand.grid(interaction.depth = 2, n.trees = 500, shrinkage = 0.1, n.minobsinnode = 10)
    > set.seed(825)
    > fit <- train(y_train ~ ., data = train, method = "gbm", trControl = fitControl, verbose = FALSE, tuneGrid = tune_Grid)
    > predicted= predict(fit, test, type= "prob")[,2]


    GBM in Python

    #import libraries
    from sklearn.ensemble import GradientBoostingClassifier #For Classification
    from sklearn.ensemble import GradientBoostingRegressor #For Regression
    #use GBM function
    clf = GradientBoostingClassifier(n_estimators=100, learning_rate=1.0, max_depth=1)
    clf.fit(X_train, y_train)
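A self-contained, runnable version of the same snippet, with synthetic data standing in for X_train/y_train and a held-out score added:

```python
# Runnable GBM sketch (make_classification is an illustrative
# stand-in for your own data)
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, random_state=3)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=3)

clf = GradientBoostingClassifier(n_estimators=100, learning_rate=1.0, max_depth=1)
clf.fit(X_train, y_train)

# accuracy on held-out data
test_accuracy = clf.score(X_test, y_test)
```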


    13. Working with XGBoost in R and Python

    XGBoost (eXtreme Gradient Boosting) is an advanced implementation of the gradient boosting algorithm. Its support for parallel computing makes it at least 10 times faster than existing gradient boosting implementations. It supports various objective functions, including regression, classification and ranking.

    R Tutorial: For R users, this is a complete tutorial on XGboost which explains the parameters along with codes in R. Check Tutorial.

    Python Tutorial: For Python users, this is a comprehensive tutorial on XGBoost, good to get you started. Check Tutorial.


    14. Where to practice ?

    Practice is the one true method of mastering any concept. Hence, you need to start practicing if you wish to master these algorithms.

    Till here, you’ve got gained significant knowledge on tree based models along with these practical implementation. It’s time that you start working on them. Here are open practice problems where you can participate and check your live rankings on leaderboard:

    For Regression:?Big Mart Sales Prediction

    For Classification:?Loan Prediction


    End Notes

    Tree based algorithms are important for every data scientist to learn. In fact, tree models are known to provide the best performance in the whole family of machine learning algorithms. In this tutorial, we covered everything up to GBM and XGBoost. And with this, we come to the end of this tutorial.

    We discussed tree based modeling from scratch. We learnt the importance of decision trees and how that simplistic concept is being used in boosting algorithms. For better understanding, I would suggest you continue practicing these algorithms. Also, do keep note of the parameters associated with boosting algorithms. I'm hoping that this tutorial has enriched you with complete knowledge on tree based modeling.

    Did you find this tutorial useful? If you have experience, what's the best trick you've used while using tree based models? Feel free to share your tricks, suggestions and opinions in the comments section below.



| 日韩精品a片一区二区三区妖精 | 色一情一乱一伦一视频免费看 | 成人三级无码视频在线观看 | 亚洲日韩av片在线观看 | 性欧美疯狂xxxxbbbb | 日本乱偷人妻中文字幕 | 任你躁国产自任一区二区三区 | 中文字幕无码免费久久9一区9 | 久久久久久久久蜜桃 | 一本精品99久久精品77 | 18精品久久久无码午夜福利 | 成人无码精品1区2区3区免费看 | 中文字幕乱妇无码av在线 | 亚欧洲精品在线视频免费观看 | 人妻夜夜爽天天爽三区 | 三上悠亚人妻中文字幕在线 | 国产av无码专区亚洲awww | 无码吃奶揉捏奶头高潮视频 | 国产又爽又猛又粗的视频a片 | 九九热爱视频精品 | 六月丁香婷婷色狠狠久久 | 青春草在线视频免费观看 | 亚洲国产精品久久人人爱 | 综合激情五月综合激情五月激情1 | 国产极品视觉盛宴 | 日韩亚洲欧美精品综合 | 丁香花在线影院观看在线播放 | 国产成人人人97超碰超爽8 | 精品久久久久香蕉网 | 亚洲中文字幕无码一久久区 | 久久久久久国产精品无码下载 | 国产av无码专区亚洲awww | 中文无码成人免费视频在线观看 | 日本饥渴人妻欲求不满 | 九月婷婷人人澡人人添人人爽 | 97人妻精品一区二区三区 | 毛片内射-百度 | 风流少妇按摩来高潮 | 久久久久久久女国产乱让韩 | 99久久精品无码一区二区毛片 | 妺妺窝人体色www在线小说 | 97色伦图片97综合影院 | 国产97色在线 | 免 | 色狠狠av一区二区三区 | 欧美 丝袜 自拍 制服 另类 | ass日本丰满熟妇pics | 在教室伦流澡到高潮hnp视频 | 亚洲国产精品成人久久蜜臀 | 精品成在人线av无码免费看 | 亚洲色大成网站www | 色综合久久久无码中文字幕 | 噜噜噜亚洲色成人网站 | 无码人妻黑人中文字幕 | 国产偷抇久久精品a片69 | 日韩亚洲欧美中文高清在线 | 一个人免费观看的www视频 | 久久www免费人成人片 | 国产无套粉嫩白浆在线 | 国产精品久久久久9999小说 | 玩弄人妻少妇500系列视频 | 水蜜桃色314在线观看 | 西西人体www44rt大胆高清 | 乱中年女人伦av三区 | 精品国产aⅴ无码一区二区 | 午夜男女很黄的视频 | 色综合久久久无码网中文 | 国产午夜无码精品免费看 | 7777奇米四色成人眼影 | 少妇高潮喷潮久久久影院 | 中文无码成人免费视频在线观看 | av无码久久久久不卡免费网站 | 日本大香伊一区二区三区 | 亚洲国产精品久久人人爱 | 亚洲人成人无码网www国产 | 国产麻豆精品一区二区三区v视界 | 欧美日韩一区二区综合 | 欧美老人巨大xxxx做受 | 97精品人妻一区二区三区香蕉 | 亚洲人亚洲人成电影网站色 | 亚洲一区二区三区含羞草 | 国产suv精品一区二区五 | 在线天堂新版最新版在线8 | 久久久久久久久蜜桃 | 国内老熟妇对白xxxxhd | 粗大的内捧猛烈进出视频 | 国产艳妇av在线观看果冻传媒 | 国产精品久久久久久久9999 | 美女张开腿让人桶 | 97久久精品无码一区二区 | 丰满少妇高潮惨叫视频 | 成人综合网亚洲伊人 | 三上悠亚人妻中文字幕在线 | 国产成人无码av一区二区 | 福利一区二区三区视频在线观看 | 国产成人综合美国十次 | 亚洲精品午夜国产va久久成人 | 亚洲熟女一区二区三区 | 欧美猛少妇色xxxxx | 宝宝好涨水快流出来免费视频 | 日韩精品无码一区二区中文字幕 | 97无码免费人妻超级碰碰夜夜 | 国产色精品久久人妻 | 老熟女重囗味hdxx69 | 久久视频在线观看精品 | 精品国产一区二区三区四区在线看 | 欧美午夜特黄aaaaaa片 | 一本色道婷婷久久欧美 | 亚洲中文字幕乱码av波多ji | 国产凸凹视频一区二区 | 性做久久久久久久久 | 精品久久8x国产免费观看 | 欧洲欧美人成视频在线 | 国产热a欧美热a在线视频 | 久久人人爽人人人人片 | 麻豆国产人妻欲求不满 | 妺妺窝人体色www在线小说 | 亚洲性无码av中文字幕 | 色窝窝无码一区二区三区色欲 | 日本熟妇大屁股人妻 | 亚洲精品国偷拍自产在线麻豆 | 性色欲网站人妻丰满中文久久不卡 | 欧美人与牲动交xxxx | 成人试看120秒体验区 | 中文精品久久久久人妻不卡 | 国产亚洲精品久久久久久国模美 | 国产成人人人97超碰超爽8 | 亚洲狠狠婷婷综合久久 | 欧美xxxxx精品 | 国精产品一品二品国精品69xx | 日本爽爽爽爽爽爽在线观看免 | 国产成人无码a区在线观看视频app | 国产网红无码精品视频 | 99久久99久久免费精品蜜桃 | 亚洲欧美精品aaaaaa片 | 成 
人 免费观看网站 | 国产成人精品无码播放 | 99久久99久久免费精品蜜桃 | 精品无码一区二区三区爱欲 | 中文字幕乱妇无码av在线 | 男人扒开女人内裤强吻桶进去 | 2019nv天堂香蕉在线观看 | 精品成在人线av无码免费看 | 波多野42部无码喷潮在线 | 亚洲成av人在线观看网址 | 天堂а√在线地址中文在线 | 日本大乳高潮视频在线观看 | 色欲av亚洲一区无码少妇 | 老子影院午夜伦不卡 | 综合人妻久久一区二区精品 | 蜜桃av抽搐高潮一区二区 | 国产特级毛片aaaaaa高潮流水 | 国产一区二区不卡老阿姨 | 亚洲无人区午夜福利码高清完整版 | 麻豆人妻少妇精品无码专区 | 亚洲 日韩 欧美 成人 在线观看 | 蜜桃视频韩日免费播放 | 天天躁日日躁狠狠躁免费麻豆 | 少妇性俱乐部纵欲狂欢电影 | 成年美女黄网站色大免费视频 | 一本久道高清无码视频 | 色婷婷欧美在线播放内射 | 岛国片人妻三上悠亚 | 99er热精品视频 | 亚洲午夜无码久久 | 亚洲国产精品一区二区美利坚 | 亚洲一区二区三区国产精华液 | 中国女人内谢69xxxx | 久久综合久久自在自线精品自 | 精品无码一区二区三区的天堂 | 久久久久99精品成人片 | 欧美日韩在线亚洲综合国产人 | 一本久久伊人热热精品中文字幕 | 99国产精品白浆在线观看免费 | 亚洲日本va午夜在线电影 | 又黄又爽又色的视频 | 国产精品多人p群无码 | 成在人线av无码免观看麻豆 | 亚洲精品国偷拍自产在线观看蜜桃 | 成人亚洲精品久久久久 | 午夜精品一区二区三区的区别 | 任你躁国产自任一区二区三区 | 日日天干夜夜狠狠爱 | 久久伊人色av天堂九九小黄鸭 | 99麻豆久久久国产精品免费 | 中文精品无码中文字幕无码专区 | 国产在线aaa片一区二区99 | 欧美精品无码一区二区三区 | 日韩少妇内射免费播放 | 日欧一片内射va在线影院 | 亚洲一区二区三区四区 | 久久国产精品二国产精品 | 国内精品九九久久久精品 | 性生交大片免费看女人按摩摩 | 无码中文字幕色专区 | 久久久国产精品无码免费专区 | 东京无码熟妇人妻av在线网址 | 久久国产劲爆∧v内射 | 色诱久久久久综合网ywww | 国产精品久久久一区二区三区 | 一区二区三区高清视频一 | 欧美成人午夜精品久久久 | 国产人妻人伦精品1国产丝袜 | 国产 精品 自在自线 | 俺去俺来也www色官网 | 内射白嫩少妇超碰 | 大肉大捧一进一出视频出来呀 |