hw7.pdf
University of Washington — ECON 487 (Economics)
May 31, 2024 · 11 pages
hw7
Yuchen Zou
2023-11-13

## 1

Consider a market in which the seller of a product knows there are two types of consumers, a high and a low preference type, which are indistinguishable from one another. The firm can produce along a product quality spectrum, such as with cars.

### a. If the firm decides to offer a low quality good, what risk do they run?

If the firm offers only a low quality good, it runs the risk of adverse selection: high preference consumers, who would have paid for a higher quality product, may exit the market, leaving the firm serving only the low type.

### b. What two options do they have to mitigate this risk?

The two options they have to mitigate this risk are product differentiation (offering a menu of quality levels so consumers self-select) and signalling.

### c. How much would the firm be willing to pay to identify each type of consumer and price discriminate accordingly?

The firm would be willing to pay up to the expected increase in profits: the difference between the profit it could earn by identifying each type and price discriminating accordingly and the profit it earns when the types are indistinguishable. For example, if perfect identification raised expected profit from $100 to $130, the firm would pay up to $30 for it.

## 2

### a.

```r
install.packages("xgboost", repos = "http://cran.us.r-project.org")
install.packages("tidyverse", repos = "http://cran.us.r-project.org")
```

Both packages were downloaded and unpacked successfully ("MD5 sums checked"). The knit log also emitted Windows "Permission denied" warnings while trying to replace the prior installations (the loaded DLLs were locked), so the existing copies were restored.
Installing the tidyverse also pulled in the dependencies 'cli', 'lubridate', 'purrr', 'readr', 'readxl', and 'tidyr'; each unpacked successfully, again with "Permission denied" warnings while replacing the locked prior installations (the existing copies were restored).

```r
install.packages("caret", repos = "http://cran.us.r-project.org")
```

caret likewise unpacked successfully, with the same warning that the prior installation could not be removed and was restored.
```r
install.packages("ggplot2", repos = "http://cran.us.r-project.org")

library(xgboost)
library(tidyverse)
```

ggplot2 was downloaded and unpacked successfully. Attaching the tidyverse (2.0.0) loaded dplyr 1.1.3, forcats 1.0.0, ggplot2 3.4.4, lubridate 1.9.0, purrr 1.0.0, readr 2.1.3, stringr 1.5.0, tibble 3.2.1, and tidyr 1.2.1, and reported the usual masking conflicts: dplyr::filter() masks stats::filter(), dplyr::lag() masks stats::lag(), and dplyr::slice() masks xgboost::slice(). The banner suggests the conflicted package (http://conflicted.r-lib.org/) to force all conflicts to become errors. Several packages warned that they were built under R version 4.2.3.
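Since dplyr::slice() silently masks xgboost::slice(), a masked call can fail in confusing ways later. A minimal sketch of making the masking explicit with the conflicted package the banner suggests (not run in the original document):

```r
# Sketch: turn silent masking into explicit choices.
# Assumes the conflicted package is installed.
library(conflicted)

conflict_prefer("filter", "dplyr")  # use dplyr::filter() whenever both match
conflict_prefer("slice", "dplyr")   # resolve the dplyr/xgboost slice() clash
```

With these preferences set, any other ambiguous call errors instead of quietly picking one package.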
```r
library(caret)
```

caret loaded its lattice dependency and masked purrr::lift(), with another "built under R version 4.2.3" warning.

```r
library(ggplot2)

oj <- read.csv("oj.csv")
oj$logprice <- log(oj$price)

# One subset per brand
d.data <- subset(oj, brand == "dominicks")
t.data <- subset(oj, brand == "tropicana")
m.data <- subset(oj, brand == "minute.maid")

# Tropicana: 80/20 train/test split plus a 20% hold-out sample
trainIndex.t <- createDataPartition(t.data$logmove, p = .8, list = FALSE)
train.2.t <- t.data[trainIndex.t, ]
test.2.t <- t.data[-trainIndex.t, ]
holdoutindex.t <- createDataPartition(t.data$logmove, p = .2, list = FALSE)
hold.out.set.t <- t.data[holdoutindex.t, ]

# Full data: same splitting scheme
trainIndex <- createDataPartition(oj$logmove, p = .8, list = FALSE)
train.2 <- oj[trainIndex, ]
test.2 <- oj[-trainIndex, ]
holdoutindex <- createDataPartition(oj$logmove, p = .2, list = FALSE)
hold.out.set <- oj[holdoutindex, ]

# Design matrices: logmove is the label, everything else a feature
train_matrix <- xgb.DMatrix(data = model.matrix(logmove ~ ., data = train.2),
                            label = train.2$logmove)
test_matrix <- xgb.DMatrix(data = model.matrix(logmove ~ ., data = test.2),
                           label = test.2$logmove)

# 5-fold CV, up to 100 rounds, stopping if test RMSE stalls for 10 rounds
cv.result <- xgb.cv(data = train_matrix, nfold = 5, nrounds = 100,
                    early_stopping_rounds = 10, print_every_n = 10)
```
```
## [1]   train-rmse:6.131015+0.001529  test-rmse:6.131164+0.007133
## Multiple eval metrics are present. Will use test_rmse for early stopping.
## Will train until test_rmse hasn't improved in 10 rounds.
##
## [11]  train-rmse:0.527552+0.003463  test-rmse:0.545752+0.008225
## [21]  train-rmse:0.427824+0.005091  test-rmse:0.457881+0.011820
## [31]  train-rmse:0.382092+0.004836  test-rmse:0.422614+0.008610
## [41]  train-rmse:0.353555+0.004424  test-rmse:0.403614+0.007545
## [51]  train-rmse:0.330745+0.003537  test-rmse:0.389620+0.008314
## [61]  train-rmse:0.313885+0.003379  test-rmse:0.381135+0.007393
## [71]  train-rmse:0.299697+0.002646  test-rmse:0.374183+0.005546
## [81]  train-rmse:0.288748+0.002386  test-rmse:0.370619+0.005719
## [91]  train-rmse:0.277840+0.002094  test-rmse:0.366893+0.005057
## [100] train-rmse:0.269881+0.001634  test-rmse:0.364993+0.004955
```

```r
cv <- print(cv.result)
```
```
## ##### xgb.cv 5-folds
##  iter train_rmse_mean train_rmse_std test_rmse_mean test_rmse_std
##     1  6.1310146  0.001528927  6.1311641  0.007133327
##     2  4.3189396  0.001176663  4.3192538  0.007361533
##     3  3.0577991  0.001293721  3.0584020  0.006771437
##     4  2.1833478  0.001038107  2.1842449  0.007036808
##     5  1.5838092  0.001080686  1.5863963  0.007733610
##     6  1.1785413  0.002330926  1.1838889  0.007722081
##     7  0.9094221  0.003247921  0.9172981  0.010063416
##     8  0.7368662  0.002334897  0.7473863  0.009413106
##     9  0.6328520  0.003529556  0.6466624  0.009571812
##    10  0.5694855  0.003511656  0.5857575  0.011787394
##    11  0.5275520  0.003463131  0.5457516  0.008224629
##    12  0.5033326  0.004103605  0.5236374  0.010240461
##    13  0.4848195  0.003623451  0.5065066  0.008299162
##    14  0.4754906  0.003439362  0.4985664  0.008250153
##    15  0.4628509  0.004756031  0.4878985  0.009561998
##    16  0.4561565  0.003521512  0.4818895  0.010460812
##    17  0.4491837  0.004584692  0.4756258  0.013000249
##    18  0.4441768  0.003827011  0.4715887  0.010942828
##    19  0.4385474  0.003856699  0.4670827  0.010500025
##    20  0.4306677  0.005071367  0.4597989  0.011823772
##    21  0.4278242  0.005090986  0.4578808  0.011819889
##    22  0.4218273  0.005459314  0.4532085  0.012040128
##    23  0.4162016  0.005907938  0.4482209  0.011478307
##    24  0.4116567  0.005354689  0.4448070  0.009616012
##    25  0.4068598  0.006982803  0.4415428  0.011129011
##    26  0.4025115  0.005743955  0.4382017  0.011833421
##    27  0.3988830  0.005594822  0.4352970  0.011680791
##    28  0.3946829  0.006277053  0.4320059  0.011929748
##    29  0.3907091  0.006726852  0.4289470  0.010210083
##    30  0.3875080  0.005956206  0.4270101  0.010251166
##    31  0.3820916  0.004835570  0.4226143  0.008610337
##    32  0.3779062  0.005952378  0.4192374  0.009411091
##    33  0.3740866  0.004819316  0.4163374  0.009537613
##    34  0.3701302  0.005501539  0.4133290  0.008378869
##    35  0.3680552  0.005325773  0.4125066  0.008121731
##    36  0.3657731  0.005398593  0.4113646  0.008833106
##    37  0.3639629  0.005435405  0.4103970  0.009045960
##    38  0.3620130  0.005403705  0.4091835  0.009329199
##    39  0.3588904  0.004083439  0.4071909  0.008603923
##    40  0.3564066  0.004656638  0.4056015  0.008707407
##    41  0.3535550  0.004423557  0.4036142  0.007545433
##    42  0.3514529  0.003418598  0.4021676  0.007455252
##    43  0.3490088  0.002844755  0.4004922  0.007590789
##    44  0.3463554  0.003556065  0.3989692  0.008248717
##    45  0.3446751  0.003412924  0.3982673  0.008230637
##    46  0.3420439  0.003361708  0.3963983  0.007990653
##    47  0.3394611  0.002953671  0.3947801  0.007753215
##    48  0.3373679  0.002702819  0.3936770  0.007639134
##    49  0.3346698  0.003770509  0.3918326  0.008698403
##    50  0.3328580  0.003581993  0.3907514  0.008384434
##    51  0.3307455  0.003536641  0.3896197  0.008314281
##    52  0.3293247  0.003500093  0.3890861  0.008178658
##    53  0.3278763  0.003658983  0.3884043  0.008324179
##    54  0.3253112  0.003759934  0.3868819  0.007965379
##    55  0.3235653  0.003918639  0.3856596  0.007802413
##    56  0.3223548  0.003213677  0.3851826  0.007784358
##    57  0.3201593  0.003084733  0.3836369  0.007366936
##    58  0.3188571  0.003436422  0.3830777  0.007389808
##    59  0.3170540  0.003498537  0.3822968  0.007419580
##    60  0.3154621  0.003521409  0.3817651  0.007380854
##    61  0.3138850  0.003379127  0.3811351  0.007392535
##    62  0.3124889  0.003582904  0.3804783  0.007328328
##    63  0.3111434  0.003161461  0.3796457  0.007422875
##    64  0.3097142  0.003401538  0.3790853  0.007113912
##    65  0.3085324  0.003320419  0.3787491  0.007270650
##    66  0.3068792  0.002874254  0.3778223  0.006548395
##    67  0.3055809  0.003001237  0.3773502  0.006638890
##    68  0.3040955  0.002377960  0.3764814  0.006210546
##    69  0.3027573  0.002709601  0.3757641  0.006103503
##    70  0.3013276  0.002453992  0.3752637  0.005995813
##    71  0.2996975  0.002646160  0.3741829  0.005545507
##    72  0.2986923  0.002434853  0.3736878  0.005596718
##    73  0.2972548  0.002600135  0.3731176  0.005430403
##    74  0.2962991  0.002401168  0.3727827  0.005498261
##    75  0.2954206  0.002387673  0.3726647  0.005432090
##    76  0.2939256  0.002458804  0.3721211  0.005549039
##    77  0.2931490  0.002152299  0.3718674  0.005552641
##    78  0.2917141  0.002238843  0.3712930  0.005515851
##    79  0.2907468  0.002556345  0.3709783  0.005500511
##    80  0.2897977  0.002552163  0.3707825  0.005451210
##    81  0.2887485  0.002385966  0.3706192  0.005719326
##    82  0.2876018  0.002294711  0.3700658  0.005563663
##    83  0.2866006  0.002224241  0.3697344  0.005366222
##    84  0.2851722  0.002379157  0.3692099  0.005313533
##    85  0.2837413  0.001930612  0.3686710  0.005339493
##    86  0.2825546  0.002251104  0.3683565  0.005309091
##    87  0.2815955  0.001942591  0.3681712  0.005504863
##    88  0.2808542  0.002118296  0.3679074  0.005427003
##    89  0.2798571  0.002332985  0.3675706  0.005281276
##    90  0.2787966  0.002226347  0.3670856  0.005020370
##    91  0.2778405  0.002094254  0.3668932  0.005057471
##    92  0.2769032  0.002442691  0.3666880  0.005012357
##    93  0.2759327  0.002677367  0.3664409  0.004983866
##    94  0.2751575  0.002760987  0.3662149  0.004857257
##    95  0.2743090  0.002461285  0.3661806  0.004883775
##    96  0.2734266  0.002276728  0.3660867  0.004974524
##    97  0.2725614  0.002150720  0.3658288  0.005061496
##    98  0.2715736  0.002169920  0.3656207  0.005062548
##    99  0.2708388  0.001813743  0.3654077  0.005103194
##   100  0.2698805  0.001634259  0.3649934  0.004955124
##  iter train_rmse_mean train_rmse_std test_rmse_mean test_rmse_std
## Best iteration:
##  iter train_rmse_mean train_rmse_std test_rmse_mean test_rmse_std
##   100  0.2698805  0.001634259  0.3649934  0.004955124
```

```r
cv
```
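One way to use the cross-validation result (a sketch, not part of the original document): pull the selected number of rounds out of `cv.result` and fit a final booster on the training matrix, then score the test matrix. `best_iteration` and `evaluation_log` are the standard fields of an `xgb.cv` object when early stopping is enabled.

```r
# Sketch: fit the final model at the CV-selected number of rounds.
# Uses train_matrix / test_matrix defined above; not run in the original.
best_n <- cv.result$best_iteration            # 100 in the log above
min(cv.result$evaluation_log$test_rmse_mean)  # ~0.365, matching the table

final_model <- xgb.train(data = train_matrix, nrounds = best_n, verbose = 0)
pred <- predict(final_model, test_matrix)
sqrt(mean((pred - getinfo(test_matrix, "label"))^2))  # held-out test RMSE
```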
## Including Plots

You can also embed plots, for example:
Note that the echo = FALSE parameter was added to the code chunk to prevent printing of the R code that generated the plot.
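The Tropicana split built earlier (`train.2.t` / `test.2.t`) goes unused in this portion of the document; presumably the same pipeline is repeated per brand. A hedged sketch of what that would look like:

```r
# Sketch (assumption: the per-brand analysis mirrors the pooled one above).
# Drop the constant brand column, which model.matrix cannot encode
# within a single-brand subset.
train_matrix.t <- xgb.DMatrix(
  data = model.matrix(logmove ~ . - brand, data = train.2.t),
  label = train.2.t$logmove)
cv.result.t <- xgb.cv(data = train_matrix.t, nfold = 5, nrounds = 100,
                      early_stopping_rounds = 10, print_every_n = 10)
```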