1 Description

This report is organized as follows:

1.1 Clinical Context

Computed tomography (CT) is a widely used and popular exam in oncology: it reflects the density of the tissues of the human body. It is therefore well suited to the study of lung cancer, because lungs are mostly filled with air (low density) while tumors are made of dense tissue.

1.2 Histological subtypes

Non-Small Cell Lung Cancer (NSCLC) can itself be split into four major subtypes based on histology observations: squamous cell carcinoma, large cell carcinoma, adenocarcinoma, and a mixture of these.

1.3 Goal

Predict the survival time of a patient (remaining days to live) from one three-dimensional CT scan (grayscale image), a set of pre-extracted quantitative imaging features (radiomics), and clinical data.

1.4 Dataset

Each patient has one CT scan and one binary segmentation mask. The segmentation mask is a binary volume of the same size as the CT scan: it contains zeros everywhere there is no tumor and ones inside the tumor. The CT scans and the associated segmentation masks are subsets of two public datasets:

  • NSCLC Radiomics (subset of 285 patients)
  • NSCLC RadioGenomics (subset of 141 patients)

Both the training and validation sets contain, for each patient, the time to event (in days) as well as a censorship indicator. Censorship indicates whether the event (death) was observed or whether the patient left the study: this can happen when the patient was lost to follow-up, or if the patient died of causes unrelated to the disease.
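
In survival-analysis terms, this is right censoring. As a minimal illustration (a sketch using the survival package, which the rest of this report does not rely on), the pair (time, event) can be encoded as:

require(survival)
# Event = 1: death observed; Event = 0: censored (lost to follow-up, unrelated death)
time  <- c(638, 421, 465)
event <- c(0, 0, 1)
Surv(time, event)   # censored times are printed with a trailing "+"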

2 Tumor Array Slices Exploration

2.1 Setting the Python version and Anaconda environment for R :-)

reticulate::use_python("/Library/Frameworks/Python.framework/Versions/3.7/bin/python3", required = TRUE)
#reticulate::use_python("/Users/Mezhoud/anaconda3/bin/python3", required = TRUE)
reticulate::py_config()
## python:         /Library/Frameworks/Python.framework/Versions/3.7/bin/python3
## libpython:      /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/config-3.7m-darwin/libpython3.7.dylib
## pythonhome:     /Library/Frameworks/Python.framework/Versions/3.7:/Library/Frameworks/Python.framework/Versions/3.7
## version:        3.7.3 (v3.7.3:ef4ec6ed12, Mar 25 2019, 16:52:21)  [Clang 6.0 (clang-600.0.57)]
## numpy:          /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/numpy
## numpy_version:  1.18.1
## 
## NOTE: Python version was forced by use_python function
#pip3 install --user sklearn
knitr::opts_chunk$set(engine.path = list(
  python = '/Library/Frameworks/Python.framework/Versions/3.7/bin/python3'
))

# check that the seaborn module is available
reticulate::py_module_available("seaborn")
## [1] TRUE
#reticulate::install_miniconda("xgboost")

2.2 Load scans and masks of lung cancer tumors

import numpy as np
from matplotlib import pyplot as plt
#from matplotlib import pyplot
from PIL import Image

img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']

print("the dimension of scan array is: ", str(scan.shape))
## the dimension of scan array is:  (92, 92, 92)
print("the dimension of mask array is: ", str(mask.shape))
## the dimension of mask array is:  (92, 92, 92)
print("plot some images from patient 002: ")
#plt.imshow(scan[:, :, 3])
## plot some images from patient 002:
f, axarr = plt.subplots(2,3)
axarr[0,0].imshow(scan[1:92, 1:92, 0])
axarr[1,0].imshow(mask[1:92, 1:92, 0])
axarr[0,1].imshow(scan[:, :, 3])
axarr[1,1].imshow(mask[:, :, 3])
axarr[0,2].imshow(scan[:, :, 80])
axarr[1,2].imshow(mask[:, :, 80])

2.2.1 Function to plot multiple images from an array


def plot_figures(figures, nrows = 1, ncols=1):
  """Plot a dictionary of figures.

  Parameters
  ----------
  figures : <title, figure> dictionary
  ncols : number of columns of subplots wanted in the display
  nrows : number of rows of subplots wanted in the figure
  """
  fig, axeslist = plt.subplots(ncols=ncols, nrows=nrows, squeeze=False)  # squeeze=False always returns a 2D axes array
  for ind, title in zip(range(len(figures)), figures):
      axeslist.ravel()[ind].imshow(figures[title], cmap='jet')
      axeslist.ravel()[ind].set_title(title)
      axeslist.ravel()[ind].set_axis_off()
  plt.tight_layout() 


img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']


# generation of a dictionary of (title, images)
number_of_im = 6
scan = {'scan'+str(i): scan[1:92, 1:92, i] for i in range(number_of_im)}

# plot the images in a figure with 2 rows and 3 columns
plot_figures(scan, 2, 3)
plt.show()

The plot shows color-mapped images of 6 scan slices. At this stage it is not easy to distinguish the tumor.

The dataset also provides a mask for each scan slice, locating the position of the tumor in the scan.

mask = {'mask'+str(i): mask[1:92, 1:92, i] for i in range(number_of_im)}
# plot the images in a figure with 2 rows and 3 columns
plot_figures(mask, 2, 3)
plt.show()

  • The first 3 slices show no tumor region; the next 3 indicate the position of the tumor in red.

  • If we plot more slices, we can observe the size of the tumor increasing across successive slices.

  • Toward the end, the size of the tumor decreases.

  • Note that the crop is adjusted to the size of the tumor.


img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']

mask = {'mask'+str(i): mask[1:92, 1:92, i] for i in range(90)}
# plot the images in a figure with 9 rows and 10 columns
plot_figures(mask, 9, 10)
plt.show()

  • We can select a representative mask per patient from the sum of its pixels (0/1): the slice with the highest sum can be chosen to represent that patient, as sketched below.
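
A quick R sketch of that idea, using the load_img_array() helper defined later in section 2.4.1.1 (the full implementation follows in section 2.4.1.5):

patient <- reticulate::py$load_img_array('train/images/patient_002.npz')
mask <- patient[[2]]                  # 92 x 92 x 92 logical array
slice_sums <- apply(mask, 3, sum)     # number of tumor (TRUE) pixels per slice
which.max(slice_sums)                 # index of the most representative slice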

If we compare with the scan slices, we obtain:

scan = {'scan'+str(i): scan[1:92, 1:92, i] for i in range(90)}
# plot the images in a figure with 9 rows and 10 columns
plot_figures(scan, 9, 10)
plt.show()

  • It is still not easy to delimit the tumor in the scan images.

  • Comparing with the masks, we note that between scan 34 and scan 65 the slices show more yellow staining (and less blue).

2.2.2 Superimposing Scan and Mask images

import numpy as np
from matplotlib import pyplot as plt
from PIL import Image

img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']


background = mask[1:92, 1:92, 56]
overlay = scan[1:92, 1:92, 56]

plt.title("Scan/Mask: 56")
plt.imshow(background, cmap='gray')
plt.imshow(overlay, cmap='jet', alpha=0.9)

  • It is now clear that the masks seem more useful than the scans, because the tumor is not clearly visible in the scan slices.

2.3 Load images from test dataset

img_array = np.load('test/images/patient_001.npz')
scan = img_array['scan']
mask = img_array['mask']


# generation of a dictionary of (title, images)
number_of_im = 90
scan = {'scan'+str(i): scan[1:92, 1:92, i] for i in range(number_of_im)}

# plot the images in a figure with 9 rows and 10 columns
plot_figures(scan, 9, 10)
plt.show()

  • Plot mask slices from the test dataset
mask = {'mask'+str(i): mask[1:92, 1:92, i] for i in range(number_of_im)}
# plot the images in a figure with 9 rows and 10 columns
plot_figures(mask, 9, 10)
plt.show()

  • Superimpose a mask and a scan slice from the test dataset
img_array = np.load('test/images/patient_001.npz')
scan = img_array['scan']
mask = img_array['mask']

background = mask[1:92, 1:92, 35]
overlay = scan[1:92, 1:92, 35]

plt.title("Scan/Mask: 35")
plt.imshow(background, cmap='gray')
plt.imshow(overlay, cmap='jet', alpha=0.9)

  • In the test images, we can also observe tumor slices, as in the train dataset.

  • For the training step, it may be better to use the mask slices than the scans, but we first need to explore the clinical and radiomics variables and think about how to associate the images with numeric variables.

  • One thing we can do is convert the slices to a dataframe (each slice in one row), so that we obtain one matrix per patient tumor.

  • At this step I will switch from Python to R :-)

2.4 Import images from the Python environment into R

The goal of this step is to convert each image matrix into a vector, so that each image occupies one row. For each sample (patient) we then obtain one dataframe with 92 rows (images).
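
As a reminder (a minimal sketch), R's as.vector() flattens a matrix column by column, which is consistent as long as the same convention is applied to every image:

m <- matrix(1:6, nrow = 2)   # a 2 x 3 matrix, filled column-wise
as.vector(m)                 # column-major flattening: 1 2 3 4 5 6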

2.4.1 Import useful R packages

2.4.1.1 Useful Python function


import numpy as np

def load_img_array(file):
  im_array = np.load(file)
  scan = im_array['scan']
  mask = im_array['mask']
  return scan,mask

2.4.1.2 Understanding the structure of the array of images

patient_002 <- reticulate::py$load_img_array('train/images/patient_002.npz')

paste0("One image is a: ", class(patient_002[[1]][,,1]))
## [1] "One image is a: matrix"
paste0("Two images are an: ", class(patient_002[[1]][,,1:2]))
## [1] "Two images are an: array"
paste0("Print the first 10 x 10 pixels of Scan N°1: "); patient_002[[1]][,,1][1:10, 1:10]
## [1] "Print the first 10 x 10 pixels of Scan N°1: "
##       [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10]
##  [1,] -777 -759 -707 -697 -749 -796 -826 -837 -858  -860
##  [2,] -783 -791 -774 -768 -787 -808 -827 -826 -829  -829
##  [3,] -804 -841 -827 -812 -820 -840 -831 -801 -792  -794
##  [4,] -830 -857 -839 -816 -805 -818 -801 -764 -734  -722
##  [5,] -844 -854 -843 -823 -799 -787 -771 -743 -704  -670
##  [6,] -844 -849 -844 -831 -821 -810 -809 -791 -760  -719
##  [7,] -848 -847 -848 -844 -847 -841 -849 -841 -829  -806
##  [8,] -847 -856 -854 -846 -840 -813 -826 -849 -848  -836
##  [9,] -840 -851 -836 -823 -835 -796 -803 -853 -857  -833
## [10,] -847 -841 -829 -817 -860 -832 -799 -842 -865  -838
paste0("Print the first 10 x 10 pixels of Mask N°3: "); patient_002[[2]][,,3][1:10, 1:10]
## [1] "Print the first 10 x 10 pixels of Mask N°3: "
##        [,1]  [,2]  [,3]  [,4]  [,5]  [,6]  [,7]  [,8]  [,9] [,10]
##  [1,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [2,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [3,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [4,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [5,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [6,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [7,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [8,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [9,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
## [10,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE

2.4.1.3 Convert the array of matrices to a list of matrices

ls_scan_patient_002 <- lapply(seq(dim(patient_002[[1]])[3]), function(x) patient_002[[1]][ , , x])
ls_mask_patient_002 <- lapply(seq(dim(patient_002[[2]])[3]), function(x) patient_002[[2]][ , , x])

paste0("The number of scan images is: ", length(ls_scan_patient_002))
## [1] "The number of scan images is: 92"
paste0("The number of mask images is: ", length(ls_mask_patient_002))
## [1] "The number of mask images is: 92"

2.4.1.4 Convert image matrix to vector

mat2vec <- function(path, target = 'mask'){
  
  # Load patient CT scan
  patient <- py$load_img_array(path)
  
    #### For scans
  if('scan' %in% target){
  # list scans
  scan <- lapply(seq(dim(patient[[1]])[3]), function(x) patient[[1]][ , , x])
    # vectorize each matrix (image) into vector
  vec_scan <- lapply(scan, function(x) as.vector(x))
   # bind vector into dataframe by row
  df_scan <-as.data.frame( do.call(rbind, vec_scan)) 
   # extract patient_id from path
  scan_id <- paste0(tools::file_path_sans_ext(basename(path)), "_scan")
  
  }
    
  #### For Masks
  if('mask' %in% target){

  # list masks
  mask <- lapply(seq(dim(patient[[2]])[3]), function(x) patient[[2]][ , , x])
  # vectorise each mask (image) to vector
  vec_mask <- lapply(mask, function(x) as.vector(x))
  # bind vector into dataframe by row
  df_mask <-as.data.frame( do.call(rbind, vec_mask)) 
  
   # extract patient_id from path
  mask_id <- paste0(tools::file_path_sans_ext(basename(path)), "_mask")
  }
  
  if("scan" %in% target && "mask" %in% target){
    # group the scan and the mask dataframes into a named list
    ls <- list(df_scan, df_mask)
    names(ls) <- c(scan_id, mask_id)
    return(ls)

  } else if("scan" %in% target){
    return(df_scan)

  } else {
    return(df_mask)
  }
}

patient2 <- mat2vec('train/images/patient_002.npz', target = c("mask", "scan"))

paste0("The output is a: ", class(patient2))
## [1] "The output is a: list"
paste0("With length of: ", length(patient2))
## [1] "With length of: 2"
paste0("The names of two elements are: ") ; names(patient2)
## [1] "The names of two elements are: "
## [1] "patient_002_scan" "patient_002_mask"
paste0("which are: ", class(patient2$patient_002_scan))
## [1] "which are: data.frame"
paste0("The dimension of each dataframe is: ") ; dim(patient2$patient_002_scan)
## [1] "The dimension of each dataframe is: "
## [1]   92 8464
  • We plan to use only the masks for modeling.

  • Potential methods: keras, mxnet.

  • The training dataset would contain 92 rows per patient, corresponding to its 92 masks. The first column is the Event (target); the remaining columns are the pixel intensities, or in our case the 0/1 tags that delimit the tumor, if present.

2.4.1.5 Select only the best mask for each Patient

We consider the mask with the most TRUE pixels to be the most representative slice for a patient. The idea is to convert all matrices to vectors and count the TRUE values in each row.

masks2 <- mat2vec('train/images/patient_002.npz', target = c("mask"))

## display the sum of TRUE in each row for Patient002
paste("Number of TRUE in each slide for Patient_002:") ; rowSums(masks2)
## [1] "Number of TRUE in each slide for Patient_002:"
##  [1]    0    0    0   55   60   60  195  209  209  504  548  548 1158 1256 1276
## [16] 1737 1809 1835 2353 2429 2474 2870 2920 2960 3225 3251 3280 3537 3567 3546
## [31] 3619 3619 3608 3558 3557 3557 3537 3534 3534 3664 3668 3671 3872 3911 3928
## [46] 3994 4025 4026 3818 3822 3806 3553 3492 3495 3552 3520 3503 2973 2913 2887
## [61] 2080 2024 1988 1479 1479 1460 1257 1293 1289 1175 1192 1199  663  679  674
## [76]  901  922  924  771  741  755  745  671  654  622  255  237  206    0    0
## [91]    0    0
get_best_mask <- function(masks){
  # sort the rows by their TRUE count (index = TRUE returns the ordering via $ix)
  # and keep the row with the most tumor pixels
  Case <- masks[sort(rowSums(masks), index = TRUE, decreasing = TRUE)$ix, ][1, ]
  return(Case)
}

masks2[sort(rowSums(masks2), index=TRUE, decreasing=TRUE)$ix, ][1,][1:10]
##       V1    V2    V3    V4    V5    V6    V7    V8    V9   V10
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE

2.4.1.6 Convert all images into one matrix, with one best-mask row per patient

require(stringr)
require(data.table)
require(dplyr)      # for mutate/left_join/select used below


get_best_masks_img2mat <- function(path){

list_path <- as.list(list.files(path,full.names = TRUE))

list_matrices <- lapply(list_path, function(x) mat2vec(x, target = c("mask")))

best_masks <- lapply(list_matrices, function(x) get_best_mask(x))

nm <- lapply(list_path, function(x) str_remove(tools::file_path_sans_ext(basename(x)), "patient_"))

#names(best_masks) <- as.numeric(nm)

#img_df <- bind_rows(best_masks)
img_df <- data.table::rbindlist(best_masks)*1

row.names(img_df) <- nm

df_img <- tibble::rownames_to_column(img_df, "PatientID")

return(df_img)
}

img_tr <- get_best_masks_img2mat("train/images/")
img_ts <- get_best_masks_img2mat("test/images/")

output_train <- data.table::fread("output_train.csv")
output_test <- data.table::fread("output_test.csv")

train_img <-  img_tr %>%
    mutate(PatientID = as.numeric(PatientID)) %>%
    left_join(output_train, by = "PatientID") %>%
    select(PatientID,  SurvivalTime, Event, everything()) #%>%
  #setDT() # convert to data.table


test_img <-  img_ts %>%
    mutate(PatientID = as.numeric(PatientID)) %>%
    left_join(output_test, by = "PatientID") %>%
    select(PatientID, SurvivalTime, Event, everything())#%>%
  #setDT() # convert to data.table



train_img[1:5,1:10]
##   PatientID SurvivalTime Event V1 V2 V3 V4 V5 V6 V7
## 1         2          638     0  0  0  0  0  0  0  0
## 2         3          421     0  0  0  0  0  0  0  0
## 3         4          465     1  0  0  0  0  0  0  0
## 4         5         1295     1  0  0  0  0  0  0  0
## 5         7         1393     0  0  0  0  0  0  0  0
data.table::fwrite(train_img , "train_img.csv")
data.table::fwrite(test_img , "test_img.csv")
#rowSums(train_img[,-1])
rm(list = ls())
invisible(gc())

3 R xgboost with best masks

3.1 Split Train dataset into Train & Valid sets

require(rsample)

train_img <- data.table::fread("train_img.csv")
test_img <- data.table::fread("test_img.csv")

# remove the PatientID column from train and the empty Event column from test
train_img[, PatientID := NULL]
test_img[, Event := NULL]

set.seed(100)
train_valid_split <- rsample::initial_split(train_img, prop = 0.8)
train_valid_split
## <241/59/300>
train_img[,1:7]
##      SurvivalTime Event V1 V2 V3 V4 V5
##   1:          638     0  0  0  0  0  0
##   2:          421     0  0  0  0  0  0
##   3:          465     1  0  0  0  0  0
##   4:         1295     1  0  0  0  0  0
##   5:         1393     0  0  0  0  0  0
##  ---                                  
## 296:          528     1  0  0  0  0  0
## 297:         1503     0  0  0  0  0  0
## 298:          315     0  0  0  0  0  0
## 299:          360     1  0  0  0  0  0
## 300:          515     1  0  0  0  0  0
  • We can retrieve our training and validation sets using the training() and testing() functions.
# Retrieve train and test sets
train_8 <- rsample::training(train_valid_split)
valid_2  <- rsample::testing(train_valid_split)
train_8[1:10, 1:10]
##     SurvivalTime Event V1 V2 V3 V4 V5 V6 V7 V8
##  1:          638     0  0  0  0  0  0  0  0  0
##  2:          421     0  0  0  0  0  0  0  0  0
##  3:          465     1  0  0  0  0  0  0  0  0
##  4:         1393     0  0  0  0  0  0  0  0  0
##  5:         1076     1  0  0  0  0  0  0  0  0
##  6:           87     1  0  0  0  0  0  0  0  0
##  7:           98     1  0  0  0  0  0  0  0  0
##  8:          316     0  0  0  0  0  0  0  0  0
##  9:           65     1  0  0  0  0  0  0  0  0
## 10:          476     1  0  0  0  0  0  0  0  0

3.2 Format train and test to DMatrix (R, Masks)

require(Matrix)
require(xgboost)


# na.action = 'na.pass' keeps rows with missing values instead of dropping them
options(na.action='na.pass')
train_8_sparse <- sparse.model.matrix(Event ~., data=train_8)
dtrain_8 <- xgb.DMatrix(data=train_8_sparse, label = train_8$Event)

options(na.action='na.pass')
valid_2_sparse <- sparse.model.matrix(Event ~., data=valid_2)
dvalid_2 <- xgb.DMatrix(data=valid_2_sparse, label = valid_2$Event)

3.3 Optimize the number of rounds with cross-validation

Here we can see after how many rounds the best test AUC is achieved.

params <- list(booster = "gbtree",
              tree_method = "auto",
              objective = "binary:logistic",
              eval_metric = "auc",         # AUC: area under the ROC curve
              max_depth = 2,        # 6 makes training heavy, there is no correlation between features #1 is not better
              eta = 0.03,                     # learning rate
              subsample = 0.8,              # prevent overfitting
              colsample_bytree = 0.1         # specify the fraction of columns to be subsampled. # 0.5 is not better
             )


tme <- Sys.time()
cv_model <- xgb.cv(params = params,
                   data = dtrain_8,
                   nthread = parallel::detectCores(all.tests = FALSE, logical = TRUE),  #2,
                   nrounds = 25000,
                   verbose = TRUE,
                   nfold = 7,
                   print_every_n = 500,
                   early_stopping_rounds = 1000,
                   maximize = TRUE,
                   prediction = TRUE) # prediction of cv folds
## [1]  train-auc:0.682223+0.055012 test-auc:0.569252+0.097524 
## Multiple eval metrics are present. Will use test_auc for early stopping.
## Will train until test_auc hasn't improved in 1000 rounds.
## 
## [501]    train-auc:0.995171+0.000974 test-auc:0.731163+0.052885 
## [1001]   train-auc:0.999758+0.000141 test-auc:0.734070+0.056987 
## [1501]   train-auc:0.999919+0.000033 test-auc:0.731769+0.050985 
## Stopping. Best iteration:
## [607]    train-auc:0.997563+0.000629 test-auc:0.738315+0.050803
Sys.time() - tme
## Time difference of 2.06715 mins

3.4 Train the model

watchlist <- list(train = dtrain_8, eval = dvalid_2)
tme <- Sys.time()
xgboost_tree_img <- xgb.train(data = dtrain_8, 
                         params = params,
                         watchlist = watchlist,
                         nrounds = cv_model$best_iteration, # more than 12000 ~0.897
                         print_every_n = 500,
                         verbose = TRUE)
## [1]  train-auc:0.673327  eval-auc:0.594706 
## [501]    train-auc:0.992188  eval-auc:0.852941 
## [607]    train-auc:0.996197  eval-auc:0.851765
Sys.time() - tme
## Time difference of 8.565483 secs
## Predict valid_2 dataset

pred_valid_img <- predict(xgboost_tree_img, dvalid_2)

paste0('SUMMARY: '); summary(pred_valid_img)
## [1] "SUMMARY: "
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.1078  0.3356  0.6286  0.5908  0.8195  0.9572
# We suppose that if Prob >= 0.5, the Event is 1, else 0
pred_bin_img <- as.numeric(pred_valid_img >= 0.5)

paste("RATIO: "); table(pred_bin_img)
## [1] "RATIO: "
## pred_bin_img
##  0  1 
## 21 38
## Confusion matrix for Tree model
data.frame(prediction = as.numeric(pred_bin_img),
         label = as.numeric(valid_2$Event)) %>%
         count(prediction, label)
## # A tibble: 4 x 3
##   prediction label     n
##        <dbl> <dbl> <int>
## 1          0     0    16
## 2          0     1     5
## 3          1     0     9
## 4          1     1    29

3.5 Prediction with (R, Masks)

test_sparse <- sparse.model.matrix(PatientID ~., data=test_img)
dtest <- xgb.DMatrix(data=test_sparse, label = test_img$PatientID)
 
pred_tree_img <- predict(xgboost_tree_img, dtest)

pred_bin_xgboost_img <- as.numeric(pred_tree_img >= 0.5)


print("ratio of predicted Events in test dataset: "); table(pred_bin_xgboost_img)
## [1] "ratio of predicted Events in test dataset: "
## pred_bin_xgboost_img
##  0  1 
## 54 71
print("ratio of Events in train dataset: "); table(train_img$Event)
## [1] "ratio of Events in train dataset: "
## 
##   0   1 
## 138 162
## submission

pred_img <- data.frame(
  PatientID = test_img$PatientID,
  Event = pred_tree_img
)

output_test <- fread("output_test.csv")

submission_img <- output_test %>%
  select(PatientID, SurvivalTime) %>%
  left_join(pred_img, by = "PatientID")

fwrite(submission_img, "submission_img.csv")

## Refresh memory
rm(list=ls())
invisible(gc())

4 Python xgboost with best masks

import matplotlib
import matplotlib.pyplot as plt
import pandas as pd
import numpy as np
import seaborn as sns
from sklearn.model_selection import train_test_split
import xgboost

test_img = pd.read_csv("test_img.csv")
train_img = pd.read_csv("train_img.csv")

# Set seed
SEED = 1423

#test_img.pop('PatientID')
#test_img.pop('Event')
#test_img.head()
# split data into train and test portions and model
features = [feat for feat in list(train_img) if feat != 'Event']
X_train, X_valid, y_train, y_valid = train_test_split(train_img[features], 
                                                 train_img[['Event']], 
                                                test_size=0.3, 
                                                 random_state=SEED)
 


import xgboost  as xgb
xgb_params = {
    'max_depth':3, 
    'eta':0.01, 
    'silent':0, 
    'eval_metric':'auc',
    'subsample': 0.8,
    'colsample_bytree': 0.8,
    'objective':'binary:logistic',
    'seed' : 1423
}

dtrain = xgb.DMatrix(X_train, y_train, feature_names=X_train.columns.values)
dvalid = xgb.DMatrix(X_valid, y_valid, feature_names=X_valid.columns.values)


evals = [(dtrain,'train'),(dvalid,'eval')]
xgb_model_img = xgb.train ( params = xgb_params,
              dtrain = dtrain,
              num_boost_round = 5000,
              verbose_eval=200, 
              early_stopping_rounds = 500,
              evals=evals,
              maximize = True)
## [0]  train-auc:0.847485  eval-auc:0.81475
## Multiple eval metrics have been passed: 'eval-auc' will be used for early stopping.
## 
## Will train until eval-auc hasn't improved in 500 rounds.
## [200]    train-auc:0.961917  eval-auc:0.85
## [400]    train-auc:0.983145  eval-auc:0.8545
## [600]    train-auc:0.995171  eval-auc:0.858
## [800]    train-auc:0.998907  eval-auc:0.857
## [1000]   train-auc:0.999818  eval-auc:0.8555
## Stopping. Best iteration:
## [615]    train-auc:0.995809  eval-auc:0.8595
# get dataframe version of important feature for model 
xgb_img_imp = pd.DataFrame(list(xgb_model_img.get_fscore().items()),
columns = ['feature','importance']).sort_values('importance', ascending=False)
xgb_img_imp.head(10)
##           feature  importance
## 0    SurvivalTime         899
## 14      PatientID         430
## 2           V4823          82
## 94          V4620          74
## 130         V6370          73
## 231         V3701          61
## 105         V3539          53
## 104         V4639          48
## 52          V5538          48
## 39          V6018          45
# Confusion matrix
dval_predictions = xgb_model_img.predict(dvalid)

from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_valid, [1 if p > 0.5 else 0 for p in dval_predictions])

plt.figure(figsize = (6,4))
plt.ticklabel_format(style='plain', axis='y', useOffset=False)
sns.set(font_scale=1.4)
sns.heatmap(cm, annot=True, annot_kws={"size": 16}) 
plt.show()

# evaluate predictions
from sklearn.metrics import accuracy_score

predictions_img = [round(value) for value in dval_predictions]

accuracy = accuracy_score(y_valid, predictions_img)

print("Accuracy: %.2f%%" % (accuracy * 100.0))
## Accuracy: 78.89%

4.1 Prediction (Python, Masks)


dtest = xgb.DMatrix(test_img[X_train.columns], feature_names=X_train.columns.values)  # select and order the test columns to match the training features

dtest_predictions = xgb_model_img.predict(dtest)

dtest_predictions[dtest_predictions > 0.5] = 1
dtest_predictions[dtest_predictions <= 0.5] = 0

output_test = pd.read_csv("output_test.csv", index_col = False)

#submission_img_p['Event'] = np.where(submission_img_p['Event'] >0.5, 1, 0)


# Format predictions in DataFrame: prediction_df
#prediction_df = pd.DataFrame(PatientID = test_img['PatientID'] ,
#                             Event = predictions_img)
                             
                             
#output_test.join(submission_img.set_index('PatientID'), on='PatientID')

output_test['Event'] = dtest_predictions




output_test.to_csv("submission_img_p.csv", index=False)


## save predictions for an ensemble
#pickle.dump(Y_pred, open('xgb_train.pickle', 'wb'))
#pickle.dump(Y_test, open('xgb_test.pickle', 'wb'))

5 Exploratory Data Analysis of radiomics and clinical data

#Load dataset
radiomics <- fread("train/features/radiomics.csv", quote = "")
clinical <- fread("train/features/clinical_data.csv")

# display only the first 8 columns of the first 6 rows
head(radiomics)[,1:8]
##           V1                          V2                          V3
## 1:                                 shape                       shape
## 2:           original_shape_Compactness1 original_shape_Compactness2
## 3: PatientID                                                        
## 4:       202                 0.027815034                 0.274891585
## 5:       371                  0.02301549                 0.188210005
## 6:       246                 0.027348106                 0.265739895
##                                  V4                                    V5
## 1:                            shape                                 shape
## 2: original_shape_Maximum3DDiameter original_shape_SphericalDisproportion
## 3:                                                                       
## 4:                      48.55924217                           1.537964054
## 5:                      75.70336849                           1.744961158
## 6:                      70.43436661                           1.555420243
##                           V6                         V7
## 1:                     shape                      shape
## 2: original_shape_Sphericity original_shape_SurfaceArea
## 3:                                                     
## 4:               0.650210255                 5431.33321
## 5:               0.573078659                10369.56873
## 6:               0.642913068                10558.81869
##                                   V8
## 1:                             shape
## 2: original_shape_SurfaceVolumeRatio
## 3:                                  
## 4:                       0.275227763
## 5:                       0.240726824
## 6:                       0.200765988
head(clinical)
##    PatientID               Histology Mstage Nstage SourceDataset Tstage     age
## 1:       202          Adenocarcinoma      0      0            l2      2 66.0000
## 2:       371              large cell      0      2            l1      4 64.5722
## 3:       246 squamous cell carcinoma      0      3            l1      2 66.0452
## 4:       240                     nos      0      2            l1      3 59.3566
## 5:       284 squamous cell carcinoma      0      3            l1      4 71.0554
## 6:       348 squamous cell carcinoma      0      2            l1      2 65.0212

The radiomics features can be divided into 4 groups, shown in row 1:

  • Group 1. First-order statistics
  • Group 2. Shape- and size-based features
  • Group 3. Textural features
  • Group 4. Wavelet features

Each group can be divided into several sub-groups, shown in row 2 of the radiomics dataset. To make the radiomics features a numeric dataset, we need to remove the first two rows and use them as column names.

groups <- radiomics[1:2,-1] %>%
  t() %>%
  as.data.frame() %>%
  rename("Groups" = V1, "Features" = V2) #%>%
#  remove_rownames()

head(groups)
##    Groups                              Features
## V2  shape           original_shape_Compactness1
## V3  shape           original_shape_Compactness2
## V4  shape      original_shape_Maximum3DDiameter
## V5  shape original_shape_SphericalDisproportion
## V6  shape             original_shape_Sphericity
## V7  shape            original_shape_SurfaceArea

To improve the aesthetics of the dataframe, we proceed as follows:

5.1 Plot the distribution of Groups and features

groups %>%
  group_by(Groups, Features) %>%
  summarise(n_Features = n()) %>%
  ggplot() +
  aes(x = Groups, y = n_Features, color = Features ) +
  geom_col() +
  theme(legend.position = "none") +
  ggtitle("Number of Features by Group")

5.1.1 Set New Colnames of radiomics

new_colnames_radiomics <- groups %>%
  mutate(Features = stringr::str_remove(Features,"original_")) %>%
  pull(Features)

new_colnames_radiomics %>% head()
## [1] "shape_Compactness1"           "shape_Compactness2"          
## [3] "shape_Maximum3DDiameter"      "shape_SphericalDisproportion"
## [5] "shape_Sphericity"             "shape_SurfaceArea"

5.1.2 Get new radiomics style

old_names <- colnames(radiomics)
new_names <- c("PatientID", new_colnames_radiomics)

new_radiomics <- radiomics[-1:-3,] %>%
  rename_at(vars(old_names), ~ new_names) %>%
  mutate_if(is.character, as.numeric) #%>%
#as.matrix()


head(new_radiomics)[,1:8]
##   PatientID shape_Compactness1 shape_Compactness2 shape_Maximum3DDiameter
## 1       202         0.02781503          0.2748916                48.55924
## 2       371         0.02301549          0.1882100                75.70337
## 3       246         0.02734811          0.2657399                70.43437
## 4       240         0.02681111          0.2554064                46.81880
## 5       284         0.02369124          0.1994242                53.79591
## 6       348         0.03098136          0.3410383                63.74951
##   shape_SphericalDisproportion shape_Sphericity shape_SurfaceArea
## 1                     1.537964        0.6502103          5431.333
## 2                     1.744961        0.5730787         10369.569
## 3                     1.555420        0.6429131         10558.819
## 4                     1.576120        0.6344693          4221.412
## 5                     1.711620        0.5842418          5295.900
## 6                     1.431305        0.6986630          8493.134
##   shape_SurfaceVolumeRatio
## 1                0.2752278
## 2                0.2407268
## 3                0.2007660
## 4                0.3238780
## 5                0.3272407
## 6                0.1976017

5.2 Glimpse correlation between features (default order)

M <- cor(new_radiomics[-1])
#corrplot(M,  method = "circle")
corrplot.mixed(M, tl.col="black", tl.pos = "lt")

5.2.1 Set the first principal component order of the features

corrplot.mixed(M, tl.col="black", tl.pos = "lt", order = "FPC")

5.2.2 Set the hierarchical clustering order of the features

corrplot.mixed(M, tl.col="black", tl.pos = "lt", order = "hclust")

  • We can return to these heatmaps once modeling reveals the most important features.

5.3 Explore Clinical data for Train

p1 <- clinical %>%
  group_by(Histology = stringi::stri_trans_totitle(Histology)) %>% # unify case variants (adenocarcinoma / Adenocarcinoma)
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = Histology, y = Count, fill= Histology) +
  geom_col()+
  geom_text(aes(label = percent(Count/sum(Count))), vjust = -0.5)+
  geom_text(aes(label = Count), vjust = -2) +
  theme(axis.text.x = element_text(color="black",size=10,hjust=.5,vjust=.5, angle=5))


p2 <- ggplot(data=clinical[!is.na(clinical$age),]) +
  aes(x= age) +
  geom_histogram(fill="blue", bins = 60)  +
  geom_vline(xintercept = c(65,70, 72 ), color = "red")
#coord_flip()

p3 <- clinical %>%
  mutate(Nstage = as.factor(Nstage)) %>%
  group_by(Mstage, Nstage, Tstage) %>%
  summarise(Count = n()) %>%
  ggplot() +
  aes(x = Tstage, y = Count, color = Nstage) +
  facet_grid(Mstage~ .) +
  geom_point(size=4, alpha = 0.8)

p4 <- clinical %>%
  group_by(SourceDataset) %>%
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = "", y = Count, fill = SourceDataset) +
  geom_bar(width = 1, stat = "identity") +
  coord_polar("y", start=0) +
  theme(legend.position = "top")

grid.arrange(p1,p2,p3,p4, layout_matrix = rbind(c(1),c(2, 3, 4)), nrow = 2)

  • The most frequent case is Adenocarcinoma, followed by Squamous Cell Carcinoma.

  • NOS: not otherwise specified.

  • It seems that NOS and Nsclc Nos correspond to the same category.

  • NaN presumably means not available?

  • The age histogram shows that the most frequent ages are 65, 70, and 72 years.

  • The most frequent Nstage class is 0, followed by 2, 3, and 1.

  • The third plot shows that most cases have Mstage == 0; we could focus on this class only.

  • There are two dataset sources.

5.4 Explore output_train and output_test

output_train <- fread("output_train.csv")
output_test <- fread("output_test.csv")
head(output_train)
##    PatientID SurvivalTime Event
## 1:       202         1378     0
## 2:       371          379     1
## 3:       246          573     1
## 4:       240          959     0
## 5:       284         2119     0
## 6:       348          706     1
head(output_test)
##    PatientID SurvivalTime Event
## 1:        13     788.4177   NaN
## 2:       155     427.6501   NaN
## 3:       404     173.5872   NaN
## 4:       407     389.8780   NaN
## 5:         9    1580.7672   NaN
## 6:        49     472.5234   NaN
  • The goal is to fill the Event variable in output_test with 0 or 1.

6 Preprocessing of Train and Test dataset

The goal of this section is to clean and unify the variables and merge the clinical, radiomics, and output_train datasets.

6.1 Train wrangling

# Convert character variables to numeric 
new_clinical <- clinical %>%
                mutate(Histology = stringi::stri_trans_totitle(Histology)) %>% 
                mutate_if(is.character, as.factor) %>%
                mutate_if(is.factor, as.numeric) 
                #mutate(Histo = as.numeric(as.factor(Histology))) %>%
                #mutate(Source = as.numeric(as.factor(SourceDataset))) %>%
                #select(everything(), - Histology, -SourceDataset)

train_features <- new_clinical %>%
  mutate_if(is.character, as.factor) %>%
  left_join(y = output_train, by = "PatientID") %>%
  left_join(y = new_radiomics, by = "PatientID") %>%
  select(PatientID, Event, everything()) %>%
  setDT()


fwrite(train_features, "train_features.csv")
train_features[,1:10] %>% head()
##    PatientID Event Histology Mstage Nstage SourceDataset Tstage     age
## 1:       202     0         1      0      0             2      2 66.0000
## 2:       371     1         2      0      2             1      4 64.5722
## 3:       246     1         6      0      3             1      2 66.0452
## 4:       240     0         4      0      2             1      3 59.3566
## 5:       284     0         6      0      3             1      4 71.0554
## 6:       348     1         6      0      2             1      2 65.0212
##    SurvivalTime shape_Compactness1
## 1:         1378         0.02781503
## 2:          379         0.02301549
## 3:          573         0.02734811
## 4:          959         0.02681111
## 5:         2119         0.02369124
## 6:          706         0.03098136

6.1.1 Explore missing value in train

require(DataExplorer)
DataExplorer::plot_missing(train_features)

  • Age is missing for 18 of the 300 patients.

6.2 Test wrangling

radiomics_test <- fread("test/features/radiomics.csv", quote = "")
clinical_test <- fread("test/features/clinical_data.csv")
output_test <- fread("output_test.csv")

6.2.1 Transform radiomics test dataset

groups_test <- radiomics_test[1:2,-1] %>%
  t() %>%
  as.data.frame() %>%
  rename("Groups" = V1, "Features" = V2)

new_colnames_radiomics_test <- groups_test %>%
  mutate(Features = stringr::str_remove(Features,"original_")) %>%
  pull(Features)

old_names_test <- colnames(radiomics_test)
new_names_test <- c("PatientID", new_colnames_radiomics_test)

new_radiomics_test <- radiomics_test[-1:-3,] %>%
  rename_at(vars(old_names_test), ~ new_names_test) %>%
  mutate_if(is.character, as.numeric) #%>%
#as.matrix()


head(new_radiomics_test)[,1:8]
##   PatientID shape_Compactness1 shape_Compactness2 shape_Maximum3DDiameter
## 1        13         0.02888522         0.29645143               106.90182
## 2       155         0.03194837         0.36266005                18.81489
## 3       404         0.01599883         0.09094503               105.08092
## 4       407         0.03135766         0.34937318                46.96807
## 5         9         0.01781454         0.11275905                56.54202
## 6        49         0.03816202         0.51744596                20.12461
##   shape_SphericalDisproportion shape_Sphericity shape_SurfaceArea
## 1                     1.499738        0.6667830        29085.5414
## 2                     1.402276        0.7131265          629.4436
## 3                     2.223687        0.4497036        12509.2654
## 4                     1.419832        0.7043089         4067.6574
## 5                     2.069901        0.4831149         7093.3657
## 6                     1.245599        0.8028264          844.2344
##   shape_SurfaceVolumeRatio
## 1                0.1145278
## 2                0.7038788
## 3                0.3152977
## 4                0.2821040
## 5                0.3760316
## 6                0.5088176

6.2.2 Transform clinical test dataset

new_clinical_test <- clinical_test %>%
                mutate(Histology = stringi::stri_trans_totitle(Histology)) %>% 
                mutate_if(is.character, as.factor) %>%
                mutate_if(is.factor, as.numeric) 

6.2.3 Explore clinical data for Test

p1 <- new_clinical_test %>%
  group_by(Histology = stringi::stri_trans_totitle(Histology)) %>% # unify case variants (adenocarcinoma / Adenocarcinoma)
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = Histology, y = Count, fill= Histology) +
  geom_col()+
  geom_text(aes(label = percent(Count/sum(Count))), vjust = -0.5)+
  geom_text(aes(label = Count), vjust = -2) +
  theme(axis.text.x = element_text(color="black",size=10,hjust=.5,vjust=.5, angle=5))


p2 <- ggplot(data=new_clinical_test[!is.na(new_clinical_test$age),]) +
  aes(x= age) +
  geom_histogram(fill="blue", bins = 60)  +
  geom_vline(xintercept = c(65,70, 72 ), color = "red")
#coord_flip()

p3 <- new_clinical_test %>%
  mutate(Nstage = as.factor(Nstage)) %>%
  group_by(Mstage, Nstage, Tstage) %>%
  summarise(Count = n()) %>%
  ggplot() +
  aes(x = Tstage, y = Count, color = Nstage) +
  facet_grid(Mstage~ .) +
  geom_point(size=4, alpha = 0.8)

p4 <- new_clinical_test %>%
  mutate(SourceDataset = as.factor(SourceDataset)) %>%
  group_by(SourceDataset) %>%
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = "", y = Count, fill = SourceDataset) +
  geom_bar(width = 1, stat = "identity") +
  coord_polar("y", start=0) +
  theme(legend.position = "top")


grid.arrange(p1,p2,p3,p4, layout_matrix = rbind(c(1),c(2, 3, 4)), nrow = 2)

  • There is a fourth Nstage class in the test data that is not present in the train data.
  • The average age seems to be lower in the test data than in the train data.
  • Histology seems to follow the same distribution.

6.2.4 Merge clinical, radiomics, and output_test dataset

test_features <- new_clinical_test %>%
  mutate_if(is.character, as.factor) %>%
  left_join(y = output_test, by = "PatientID") %>%
  left_join(y = new_radiomics_test, by = "PatientID") %>%
  select(PatientID, Event, everything()) %>%
  setDT() # convert to data.table


fwrite(test_features, "test_features.csv")
test_features[,1:10] %>% head()
##    PatientID Event Histology Mstage Nstage SourceDataset Tstage     age
## 1:        13   NaN         4      0      0             1      4 44.3970
## 2:       155   NaN         1      0      3             1      1 63.3183
## 3:       404   NaN         2      0      2             1      2 64.7255
## 4:       407   NaN         4      0      0             1      2 65.3635
## 5:         9   NaN         1      0      0             2      2 50.0000
## 6:        49   NaN         6      0      0             1      2 86.1410
##    SurvivalTime shape_Compactness1
## 1:     788.4177         0.02888522
## 2:     427.6501         0.03194837
## 3:     173.5872         0.01599883
## 4:     389.8780         0.03135766
## 5:    1580.7672         0.01781454
## 6:     472.5234         0.03816202

6.2.4.1 Explore missing value in test

library(DataExplorer)
DataExplorer::plot_missing(test_features)

  • Age is missing for 4 of the 125 patients.
# refresh memory
rm(list = ls())
invisible(gc())

7 Pull Radiomics, Clinical data and the best Masks
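
The code for this step is not shown in the report; a minimal sketch of what it must do (an assumption based on the files written above and the train_fea_img.csv/test_fea_img.csv files read in section 9, joining on PatientID) is:

require(data.table)
require(dplyr)

train_features <- fread("train_features.csv")
test_features  <- fread("test_features.csv")
train_img <- fread("train_img.csv")   # best-mask vectors, one row per patient
test_img  <- fread("test_img.csv")

# drop the duplicated outcome columns from the mask tables before joining
train <- train_features %>%
  left_join(select(train_img, -SurvivalTime, -Event), by = "PatientID") %>%
  setDT()
test <- test_features %>%
  left_join(select(test_img, -SurvivalTime, -Event), by = "PatientID") %>%
  setDT()

fwrite(train, "train_fea_img.csv")
fwrite(test,  "test_fea_img.csv")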

8 R Xgboost modeling with Radiomics, Clinical and Masks

8.1 Scaling Train and Test dataset

trainremoveCols <- c('PatientID','Event')
testremoveCols <- c('PatientID', 'Event')

Event <- train$Event
PatientID <- test$PatientID

train[,(trainremoveCols) := NULL]
test[,(testremoveCols) := NULL]

# Do scaling
dt <- rbind(train, test)
scale.cols <- colnames(dt)
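# in-place data.table update: .SDcols restricts .SD to scale.cols,
# and lapply(.SD, scale) centers and rescales every column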
dt[, (scale.cols) := lapply(.SD, scale), .SDcols = scale.cols]
train <- cbind(Event, head(dt,nrow(train)))
test  <- cbind(PatientID, tail(dt, nrow(test)))
rm(dt)
invisible(gc())

8.2 Split Train dataset into Train & Valid sets

require(rsample)

set.seed(100)
train_valid_split <- rsample::initial_split(train, prop = 0.8)
train_valid_split
## <241/59/300>
  • We can retrieve our training and validation sets using the training() and testing() functions.
# Retrieve train and test sets
train_8 <- rsample::training(train_valid_split)
valid_2  <- rsample::testing(train_valid_split)
train_8[1:10, 1:10]
##     Event  Histology     Mstage     Nstage SourceDataset     Tstage        age
##  1:     0 -1.0235305 -0.1207812 -0.8392371     1.4175490 -0.1386218 -0.2504355
##  2:     1 -0.5217999 -0.1207812  0.8392371    -0.7037831  1.7024495 -0.3972837
##  3:     1  1.4851227 -0.1207812  1.6784742    -0.7037831 -0.1386218 -0.2457867
##  4:     0  1.4851227 -0.1207812  1.6784742    -0.7037831  1.7024495  0.2695086
##  5:     1  1.4851227 -0.1207812  0.8392371    -0.7037831 -0.1386218 -0.3511044
##  6:     1 -1.0235305 -0.1207812 -0.8392371     1.4175490 -1.0591575  0.1609615
##  7:     1 -1.0235305 -0.1207812 -0.8392371    -0.7037831  1.7024495  0.6186818
##  8:     1 -1.0235305 -0.1207812  0.8392371    -0.7037831  1.7024495 -1.5788159
##  9:     0 -0.5217999 -0.1207812  0.8392371    -0.7037831 -0.1386218 -2.0372357
## 10:     1  1.4851227 -0.1207812  0.8392371     1.4175490 -0.1386218  1.1894541
##     SurvivalTime shape_Compactness1 shape_Compactness2
##  1:    0.7487397          0.3273910          0.2206572
##  2:   -0.7068626         -0.4508739         -0.5451008
##  3:   -0.4241931          0.2516768          0.1398098
##  4:    1.8284207         -0.3412982         -0.4460329
##  5:   -0.2304042          0.8408228          0.8050072
##  6:    0.7356262         -0.8618943         -0.8911634
##  7:   -0.9720474          0.3619137          0.2579747
##  8:   -0.9735045         -1.1924952         -1.1402475
##  9:    0.3815607          0.9177119          0.8979347
## 10:   -0.7345467         -0.0934834         -0.2114097

8.3 Format train and test to DMatrix

require(Matrix)
require(xgboost)

# na.pass keeps rows with missing values (e.g. age) instead of dropping them
options(na.action='na.pass')
train_8_sparse <- sparse.model.matrix(Event ~., data=train_8)
dtrain_8 <- xgb.DMatrix(data=train_8_sparse, label = train_8$Event)

options(na.action='na.pass')
valid_2_sparse <- sparse.model.matrix(Event ~., data=valid_2)
dvalid_2 <- xgb.DMatrix(data=valid_2_sparse, label = valid_2$Event)

8.4 Optimize the number of rounds with cross-validation

Here we can see after how many rounds the best test AUC is achieved.

params <- list(booster = "gbtree",
              tree_method = "auto",
              objective = "binary:logistic",
              eval_metric = "auc",         # AUC: area under the ROC curve
              max_depth = 2,        # 6 makes training heavy, there is no correlation between features #1 is not better
              eta = 0.03,                     # learning rate
              subsample = 0.8,              # prevent overfitting
              colsample_bytree = 0.1         # specify the fraction of columns to be subsampled. # 0.5 is not better
             )


tme <- Sys.time()
cv_model <- xgb.cv(params = params,
                   data = dtrain_8,
                   nthread = parallel::detectCores(all.tests = FALSE, logical = TRUE),  #2,
                   nrounds = 25000,
                   verbose = TRUE,
                   nfold = 7,
                   print_every_n = 500,
                   early_stopping_rounds = 1000,
                   maximize = TRUE,
                   prediction = TRUE) # prediction of cv folds
## [1]  train-auc:0.718051+0.043889 test-auc:0.619140+0.061644 
## Multiple eval metrics are present. Will use test_auc for early stopping.
## Will train until test_auc hasn't improved in 1000 rounds.
## 
## [501]    train-auc:0.999879+0.000097 test-auc:0.749731+0.075059 
## [1001]   train-auc:1.000000+0.000000 test-auc:0.753281+0.076136 
## [1501]   train-auc:1.000000+0.000000 test-auc:0.761041+0.071028 
## [2001]   train-auc:1.000000+0.000000 test-auc:0.757107+0.069419 
## [2501]   train-auc:1.000000+0.000000 test-auc:0.756282+0.072293 
## Stopping. Best iteration:
## [1561]   train-auc:1.000000+0.000000 test-auc:0.762555+0.071525
Sys.time() - tme
## Time difference of 15.72488 mins

8.5 Train the model

watchlist <- list(train = dtrain_8, eval = dvalid_2)
tme <- Sys.time()
xgboost_tree <- xgb.train(data = dtrain_8, 
                         params = params,
                         watchlist = watchlist,
                         nrounds = cv_model$best_iteration, # more than 12000 ~0.897
                         print_every_n = 500,
                         verbose = TRUE)
## [1]  train-auc:0.688073  eval-auc:0.543103 
## [501]    train-auc:0.999444  eval-auc:0.842529 
## [1001]   train-auc:1.000000  eval-auc:0.865517 
## [1501]   train-auc:1.000000  eval-auc:0.864368 
## [1561]   train-auc:1.000000  eval-auc:0.864368
Sys.time() - tme
## Time difference of 1.606111 mins

8.6 Predict valid_2 dataset

pred_valid <- predict(xgboost_tree, dvalid_2)

summary(pred_valid)
##     Min.  1st Qu.   Median     Mean  3rd Qu.     Max. 
## 0.002944 0.064929 0.499536 0.434123 0.738628 0.998919
  • We suppose that if the predicted probability exceeds a threshold, the Event is 1, else 0; below, a threshold of 0.63 is used instead of 0.5.

8.7 Transform probability to binary classification

table(as.numeric(pred_valid >= 0.63))
## 
##  0  1 
## 34 25

8.8 Confusion matrix for Tree model and performances

# data.frame(prediction = as.numeric(pred_bin),
#          label = as.numeric(valid_2$Event)) %>%
#          count(prediction, label)

pred_valid_bin <- as.factor(ifelse(pred_valid > 0.63,1,0))

valid_2_event <- as.factor(valid_2$Event)

caret::confusionMatrix(pred_valid_bin, valid_2_event)
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction  0  1
##          0 25  9
##          1  4 21
##                                           
##                Accuracy : 0.7797          
##                  95% CI : (0.6527, 0.8771)
##     No Information Rate : 0.5085          
##     P-Value [Acc > NIR] : 1.683e-05       
##                                           
##                   Kappa : 0.5605          
##                                           
##  Mcnemar's Test P-Value : 0.2673          
##                                           
##             Sensitivity : 0.8621          
##             Specificity : 0.7000          
##          Pos Pred Value : 0.7353          
##          Neg Pred Value : 0.8400          
##              Prevalence : 0.4915          
##          Detection Rate : 0.4237          
##    Detection Prevalence : 0.5763          
##       Balanced Accuracy : 0.7810          
##                                           
##        'Positive' Class : 0               
## 
pred_valid_error <- ROCR::prediction(pred_valid , valid_2$Event)
auc.valid <- ROCR::performance(pred_valid_error, "auc")
print("AUC: "); as.numeric(auc.valid@y.values)
## [1] "AUC: "
## [1] 0.8643678

8.9 Extract the most important features from tree xgboost model

8.9.1 List the most important features

features <- colnames(train_8)
importance_matrix_tree <- xgb.importance(features, model = xgboost_tree)
importance_matrix_tree
##                           Feature         Gain        Cover    Frequency
##   1:                 SurvivalTime 1.518797e-01 6.403812e-02 0.0451349512
##   2:             glcm_Correlation 2.110504e-02 2.498131e-02 0.0242685416
##   3:                          age 2.073723e-02 2.646743e-02 0.0215468360
##   4:      glcm_MaximumProbability 2.024262e-02 1.942644e-02 0.0176910864
##   5: glrlm_GrayLevelNonUniformity 1.741357e-02 1.557860e-02 0.0151961896
##  ---                                                                    
## 884:                        V3718 1.371769e-05 3.366216e-05 0.0002268088
## 885:                        V3809 9.926757e-06 3.106980e-05 0.0002268088
## 886:                        V6129 8.795324e-06 3.016645e-05 0.0002268088
## 887:                        V4196 2.153746e-06 3.204075e-05 0.0002268088
## 888:                        V3894 8.746036e-07 4.291979e-05 0.0002268088
  • SurvivalTime is the most important feature, followed by glcm_Correlation, age, and several other texture descriptors.

8.9.2 Plot the most important features (Tree model)

library(Ckmeans.1d.dp)
xgb.ggplot.importance(importance_matrix_tree[1:30,]) +
ggplot2::theme_minimal()

8.10 Test Xgboost Prediction

8.10.1 Load test data and format to DMatrix

test_sparse <- sparse.model.matrix(PatientID ~., data=test)
dtest <- xgb.DMatrix(data=test_sparse, label = test$PatientID)
pred_test <- predict(xgboost_tree, dtest)

pred_test_bin <- as.numeric(pred_test >= 0.63)


print("ratio of predicted Events in test dataset: "); table(pred_test_bin)
## [1] "ratio of predicted Events in test dataset: "
## pred_test_bin
##  0  1 
## 73 52
print("ratio of Events in train dataset: "); table(train$Event)
## [1] "ratio of Events in train dataset: "
## 
##   0   1 
## 138 162

8.11 Submission

pred_test_bin <- as.factor(ifelse(pred_test > 0.63,1,0))

output_test <- fread("output_test.csv")

pred <- data.frame(
  PatientID = output_test$PatientID,
  Event = pred_test_bin
)



submission <- output_test %>%
  select(PatientID, SurvivalTime) %>%
  left_join(pred, by = "PatientID")

fwrite(submission, "submission_fea_img_r.csv")

submission %>% head(10)
##    PatientID SurvivalTime Event
## 1         13     788.4177     0
## 2        155     427.6501     1
## 3        404     173.5872     1
## 4        407     389.8780     1
## 5          9    1580.7672     0
## 6         49     472.5234     1
## 7         55    1970.9725     0
## 8        200     530.4248     0
## 9        170    1067.4630     0
## 10       387     378.3248     1
rm(list = ls())
invisible(gc())

9 Python xgboost Prediction (Python, Radiomics, Clinical, Masks datasets)

import matplotlib
import matplotlib.pyplot as plt
import pandas as pd
import numpy as np
import seaborn as sns
from sklearn.model_selection import train_test_split

test = pd.read_csv("test_fea_img.csv")
train = pd.read_csv("train_fea_img.csv")
train.head()
##    PatientID  Event  Histology  Mstage  ...  V8461  V8462  V8463  V8464
## 0        202      0          1       0  ...      0      0      0      0
## 1        371      1          2       0  ...      0      0      0      0
## 2        246      1          6       0  ...      0      0      0      0
## 3        240      0          4       0  ...      0      0      0      0
## 4        284      0          6       0  ...      0      0      0      0
## 
## [5 rows x 8526 columns]
# impute missing age with the mean
train['age'] = train['age'].fillna(train['age'].mean())
test['age'] = test['age'].fillna(test['age'].mean())

test.head()
##    PatientID  Event  Histology  Mstage  ...  V8461  V8462  V8463  V8464
## 0         13    NaN          4       0  ...      0      0      0      0
## 1        155    NaN          1       0  ...      0      0      0      0
## 2        404    NaN          2       0  ...      0      0      0      0
## 3        407    NaN          4       0  ...      0      0      0      0
## 4          9    NaN          1       0  ...      0      0      0      0
## 
## [5 rows x 8526 columns]
# Set seed
SEED = 1423
train = train.drop('PatientID', axis=1)
test = test.drop(['PatientID', 'Event'], axis = 1)

def prepare_data_for_model(raw_dataframe, target_columns, drop_first = True, make_na_col = False):
    # dummy all categorical fields 
    dataframe_dummy = pd.get_dummies(raw_dataframe, columns=target_columns, 
                                     drop_first=drop_first, 
                                     dummy_na=make_na_col)
    return (dataframe_dummy)

# create dummy features 
train_xgboost = prepare_data_for_model(train, target_columns=['Histology', 'Tstage']) #, 'Nstage' : 3 classes
train_xgboost = train_xgboost.dropna() 

# create dummy features for test
test_xgboost = prepare_data_for_model(test, target_columns=['Histology', 'Tstage']) #, 'Nstage' : 4 classes
test_xgboost = test_xgboost.dropna() 

train_xgboost.head()
##    Event  Mstage  Nstage  SourceDataset  ...  Tstage_2  Tstage_3  Tstage_4  Tstage_5
## 0      0       0       0              2  ...         1         0         0         0
## 1      1       0       2              1  ...         0         0         1         0
## 2      1       0       3              1  ...         1         0         0         0
## 3      0       0       2              1  ...         0         1         0         0
## 4      0       0       3              1  ...         0         0         1         0
## 
## [5 rows x 8532 columns]
test_xgboost.head()
##    Mstage  Nstage  SourceDataset  ...  Tstage_3  Tstage_4  Tstage_5
## 0       0       0              1  ...         0         1         0
## 1       0       3              1  ...         0         0         0
## 2       0       2              1  ...         0         0         0
## 3       0       0              1  ...         0         0         0
## 4       0       0              2  ...         0         0         0
## 
## [5 rows x 8531 columns]
# split data into train and test portions and model
features = [feat for feat in list(train_xgboost) if feat != 'Event']
X_train, X_valid, y_train, y_valid = train_test_split(train_xgboost[features], 
                                                 train_xgboost[['Event']], 
                                                test_size=0.3, 
                                                 random_state=SEED)
 


import xgboost as xgb
xgb_params = {
    'max_depth':3, 
    'eta':0.01, 
    'silent':0, 
    'eval_metric':'auc',
    'subsample': 0.8,
    'colsample_bytree': 0.8,
    'objective':'binary:logistic',
    'seed' : 1423
}

dtrain = xgb.DMatrix(X_train, y_train, feature_names=X_train.columns.values)
dvalid = xgb.DMatrix(X_valid, y_valid, feature_names=X_valid.columns.values)


evals = [(dtrain,'train'),(dvalid,'eval')]
xgb_model_fea = xgb.train(params=xgb_params,
                          dtrain=dtrain,
                          num_boost_round=5000,
                          verbose_eval=200,
                          early_stopping_rounds=500,
                          evals=evals,
                          maximize=True)
## [0]  train-auc:0.865973  eval-auc:0.842391
## Multiple eval metrics have been passed: 'eval-auc' will be used for early stopping.
## 
## Will train until eval-auc hasn't improved in 500 rounds.
## [200]    train-auc:0.986367  eval-auc:0.837945
## [400]    train-auc:0.999171  eval-auc:0.839427
## Stopping. Best iteration:
## [3]  train-auc:0.928519  eval-auc:0.845109
# get a dataframe of feature importances (F-scores), sorted descending
xgb_fea_imp = pd.DataFrame(list(xgb_model_fea.get_fscore().items()),
                           columns=['feature', 'importance']).sort_values('importance', ascending=False)
xgb_fea_imp.head(10)
##                                feature  importance
## 0                         SurvivalTime         490
## 3              glcm_MaximumProbability         101
## 2                                  age          99
## 6                    glcm_ClusterShade          81
## 19   glrlm_LongRunLowGrayLevelEmphasis          67
## 32                    glcm_Correlation          61
## 8   glrlm_ShortRunLowGrayLevelEmphasis          59
## 70                   firstorder_Median          57
## 33              glcm_ClusterProminence          56
## 13       glrlm_LowGrayLevelRunEmphasis          52
# Confusion matrix
dval_predictions = xgb_model_fea.predict(dvalid)

import seaborn as sns
from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_valid, [1 if p > 0.5 else 0 for p in dval_predictions])

plt.figure(figsize = (6,4))
plt.ticklabel_format(style='plain', axis='y', useOffset=False)
sns.set(font_scale=1.4)
sns.heatmap(cm, annot=True, annot_kws={"size": 16}) 
plt.show()

# evaluate predictions
from sklearn.metrics import accuracy_score

dval_predictions[dval_predictions > 0.5] = 1
dval_predictions[dval_predictions <= 0.5] = 0

predictions = [round(value) for value in dval_predictions]  # already 0/1, so round() is a no-op
accuracy = accuracy_score(y_valid, predictions)
print("Accuracy: %.2f%%" % (accuracy * 100.0))
## Accuracy: 72.22%

9.1 Prediction with Python, features, Masks

dtest = xgb.DMatrix(test_xgboost, feature_names=X_train.columns.values)

dtest_predictions = xgb_model_fea.predict(dtest)

dtest_predictions[dtest_predictions > 0.5] = 1
dtest_predictions[dtest_predictions <= 0.5] = 0

output_test = pd.read_csv('output_test.csv', index_col=False)

output_test['Event'] = dtest_predictions

# submission_fea_img_p['Event'] = np.where(submission_fea_img_p['Event'] >0.5, 1, 0)

output_test.to_csv("submission_fea_img_p.csv", index=False)


output_test.head()
##    PatientID  SurvivalTime  Event
## 0         13    788.417673    0.0
## 1        155    427.650092    1.0
## 2        404    173.587222    1.0
## 3        407    389.877973    1.0
## 4          9   1580.767244    0.0

10 Random Forest Survival model (ranger package)

This method gives us an event probability over a time continuum (by flipping the predicted non-event, i.e. survival, probability into an event probability).
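Concretely, a ranger survival forest returns one survival curve per patient, evaluated at the forest's unique death times. A minimal sketch of the flip, assuming a fitted forest stored in a hypothetical object fit:

# event probability at horizon t from a ranger survival forest `fit`
# (hypothetical object): the survival curve is a right-continuous step
# function over fit$unique.death.times, so look up the last time <= t
event_prob_at <- function(fit, t) {
  idx <- findInterval(t, fit$unique.death.times)
  if (idx == 0) return(rep(0, nrow(fit$survival))) # before the first observed event
  1 - fit$survival[, idx]
}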

Let’s take a quick look at the time period range in the training portion of our data set:

test <- fread("test_features.csv" )
train <- fread("train_features.csv")


# # reset train and test dataset
# train <-new_clinical %>%
#   mutate_if(is.character, as.factor) %>%
#   left_join(y = output_train, by = "PatientID") %>%
#   left_join(y = new_radiomics, by = "PatientID") %>%
#   select(PatientID, Event, everything()) %>%
#   setDT() # convert to data.table
# 
# 
#  test <- new_clinical_test %>%
#   mutate_if(is.character, as.factor) %>%
#   left_join(y = output_test, by = "PatientID") %>%
#   left_join(y = new_radiomics_test, by = "PatientID") %>%
#   select(PatientID, Event, everything()) %>%
#   setDT() # convert to data.table

plot(sort(train$SurvivalTime), pch='.', type='o',
     col='blue', lwd=2 ,
     main = 'Survival Days of Lung Cancer')
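The same range can also be read off numerically (a quick sketch):

# numeric counterpart of the plot above
summary(train$SurvivalTime)
range(train$SurvivalTime)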

10.1 Dealing with missing Age

10.1.1 Check the data for missing values.

sapply(train, function(x) sum(is.na(x)))
##                           PatientID                               Event 
##                                   0                                   0 
##                           Histology                              Mstage 
##                                   0                                   0 
##                              Nstage                       SourceDataset 
##                                   0                                   0 
##                              Tstage                                 age 
##                                   0                                  16 
##                        SurvivalTime                  shape_Compactness1 
##                                   0                                   0 
##                  shape_Compactness2             shape_Maximum3DDiameter 
##                                   0                                   0 
##        shape_SphericalDisproportion                    shape_Sphericity 
##                                   0                                   0 
##                   shape_SurfaceArea            shape_SurfaceVolumeRatio 
##                                   0                                   0 
##                   shape_VoxelVolume                   firstorder_Energy 
##                                   0                                   0 
##                  firstorder_Entropy                 firstorder_Kurtosis 
##                                   0                                   0 
##                  firstorder_Maximum                     firstorder_Mean 
##                                   0                                   0 
##    firstorder_MeanAbsoluteDeviation                   firstorder_Median 
##                                   0                                   0 
##                  firstorder_Minimum                    firstorder_Range 
##                                   0                                   0 
##          firstorder_RootMeanSquared                 firstorder_Skewness 
##                                   0                                   0 
##        firstorder_StandardDeviation               firstorder_Uniformity 
##                                   0                                   0 
##                 firstorder_Variance                glcm_Autocorrelation 
##                                   0                                   0 
##              glcm_ClusterProminence                   glcm_ClusterShade 
##                                   0                                   0 
##                glcm_ClusterTendency                       glcm_Contrast 
##                                   0                                   0 
##                    glcm_Correlation              glcm_DifferenceEntropy 
##                                   0                                   0 
##              glcm_DifferenceAverage                    glcm_JointEnergy 
##                                   0                                   0 
##                   glcm_JointEntropy                             glcm_Id 
##                                   0                                   0 
##                            glcm_Idm                           glcm_Imc1 
##                                   0                                   0 
##                           glcm_Imc2                           glcm_Idmn 
##                                   0                                   0 
##                            glcm_Idn                glcm_InverseVariance 
##                                   0                                   0 
##             glcm_MaximumProbability                     glcm_SumAverage 
##                                   0                                   0 
##                     glcm_SumEntropy              glrlm_ShortRunEmphasis 
##                                   0                                   0 
##               glrlm_LongRunEmphasis        glrlm_GrayLevelNonUniformity 
##                                   0                                   0 
##        glrlm_RunLengthNonUniformity                 glrlm_RunPercentage 
##                                   0                                   0 
##       glrlm_LowGrayLevelRunEmphasis      glrlm_HighGrayLevelRunEmphasis 
##                                   0                                   0 
##  glrlm_ShortRunLowGrayLevelEmphasis glrlm_ShortRunHighGrayLevelEmphasis 
##                                   0                                   0 
##   glrlm_LongRunLowGrayLevelEmphasis  glrlm_LongRunHighGrayLevelEmphasis 
##                                   0                                   0

10.1.2 Imputation processing for train data

require(mice)
init = mice(train, maxit=0)
## Warning: Number of logged events: 1
meth = init$method
predM = init$predictorMatrix
  • We may not want to use a certain variable as a predictor. For example, the PatientID variable does not have any predictive value.
predM[, c("PatientID")] <- 0
  • To skip a variable from the imputation itself, set its method to the empty string, as below (here PatientID is neither imputed nor used as a predictor):
#colnames(train)[train[, !names(train) %in% c("PatientID")]]

meth[c("PatientID")]=""
  • Now let's specify the methods for imputing the missing values. There are specific methods for continuous, binary and ordinal variables.
meth[c("age")]="cart"  # pmm (Predictive Mean Matching suitable for numeric variables )

# pmm generate error. it is seems working with cart
# (https://stackoverflow.com/questions/48355250/
#do-imputation-in-r-when-mice-returns-error-that-system-is-computationally-singu)
  • Now it is time to run the multiple (m=5) imputation.
set.seed(103)
imputed = mice(train, method=meth, predictorMatrix=predM, m=5)
## 
##  iter imp variable
##   1   1  age
##   1   2  age
##   1   3  age
##   1   4  age
##   1   5  age
##   2   1  age
##   2   2  age
##   2   3  age
##   2   4  age
##   2   5  age
##   3   1  age
##   3   2  age
##   3   3  age
##   3   4  age
##   3   5  age
##   4   1  age
##   4   2  age
##   4   3  age
##   4   4  age
##   4   5  age
##   5   1  age
##   5   2  age
##   5   3  age
##   5   4  age
##   5   5  age
## Warning: Number of logged events: 25
  • Create the completed dataset after imputation.
imputed <- complete(imputed)
  • Check for missing values in the imputed dataset.
sapply(imputed, function(x) sum(is.na(x)))
##                           PatientID                               Event 
##                                   0                                   0 
##                           Histology                              Mstage 
##                                   0                                   0 
##                              Nstage                       SourceDataset 
##                                   0                                   0 
##                              Tstage                                 age 
##                                   0                                   0 
##                        SurvivalTime                  shape_Compactness1 
##                                   0                                   0 
##                  shape_Compactness2             shape_Maximum3DDiameter 
##                                   0                                   0 
##        shape_SphericalDisproportion                    shape_Sphericity 
##                                   0                                   0 
##                   shape_SurfaceArea            shape_SurfaceVolumeRatio 
##                                   0                                   0 
##                   shape_VoxelVolume                   firstorder_Energy 
##                                   0                                   0 
##                  firstorder_Entropy                 firstorder_Kurtosis 
##                                   0                                   0 
##                  firstorder_Maximum                     firstorder_Mean 
##                                   0                                   0 
##    firstorder_MeanAbsoluteDeviation                   firstorder_Median 
##                                   0                                   0 
##                  firstorder_Minimum                    firstorder_Range 
##                                   0                                   0 
##          firstorder_RootMeanSquared                 firstorder_Skewness 
##                                   0                                   0 
##        firstorder_StandardDeviation               firstorder_Uniformity 
##                                   0                                   0 
##                 firstorder_Variance                glcm_Autocorrelation 
##                                   0                                   0 
##              glcm_ClusterProminence                   glcm_ClusterShade 
##                                   0                                   0 
##                glcm_ClusterTendency                       glcm_Contrast 
##                                   0                                   0 
##                    glcm_Correlation              glcm_DifferenceEntropy 
##                                   0                                   0 
##              glcm_DifferenceAverage                    glcm_JointEnergy 
##                                   0                                   0 
##                   glcm_JointEntropy                             glcm_Id 
##                                   0                                   0 
##                            glcm_Idm                           glcm_Imc1 
##                                   0                                   0 
##                           glcm_Imc2                           glcm_Idmn 
##                                   0                                   0 
##                            glcm_Idn                glcm_InverseVariance 
##                                   0                                   0 
##             glcm_MaximumProbability                     glcm_SumAverage 
##                                   0                                   0 
##                     glcm_SumEntropy              glrlm_ShortRunEmphasis 
##                                   0                                   0 
##               glrlm_LongRunEmphasis        glrlm_GrayLevelNonUniformity 
##                                   0                                   0 
##        glrlm_RunLengthNonUniformity                 glrlm_RunPercentage 
##                                   0                                   0 
##       glrlm_LowGrayLevelRunEmphasis      glrlm_HighGrayLevelRunEmphasis 
##                                   0                                   0 
##  glrlm_ShortRunLowGrayLevelEmphasis glrlm_ShortRunHighGrayLevelEmphasis 
##                                   0                                   0 
##   glrlm_LongRunLowGrayLevelEmphasis  glrlm_LongRunHighGrayLevelEmphasis 
##                                   0                                   0
  • Check the accuracy of the imputation by comparing the age summaries before and after:
print("train$age") ; summary(train$age)
## [1] "train$age"
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max.    NA's 
##   42.51   62.98   69.95   68.77   76.20   87.13      16
print("imputed$age") ; summary(imputed$age)
## [1] "imputed$age"
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   42.51   63.00   70.00   68.94   76.42   87.13
fwrite(imputed, "train_features_imputed.csv")
  • Well done :-)
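Summary statistics only go so far; mice also provides diagnostic plots, but they operate on the mids object, i.e. before complete() is called. A minimal sketch, re-running the imputation into a hypothetical imp object:

# keep the mids object around for diagnostics
imp <- mice(train, method = meth, predictorMatrix = predM, m = 5, seed = 103)

# observed vs imputed age densities should roughly overlap
densityplot(imp, ~ age)

# imputed age values for each of the m imputations
stripplot(imp, age ~ .imp, pch = 20)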

10.1.3 Do the same imputation for the test dataset

init = mice(test, maxit=0)
## Warning: Number of logged events: 2
meth = init$method
predM = init$predictorMatrix

predM[, c("PatientID")] <- 0
meth[c("PatientID")]=""
meth[c("age")]="cart"

set.seed(103)
imputed_test = mice(test, method=meth, predictorMatrix=predM, m=5)
## 
##  iter imp variable
##   1   1  age
##   1   2  age
##   1   3  age
##   1   4  age
##   1   5  age
##   2   1  age
##   2   2  age
##   2   3  age
##   2   4  age
##   2   5  age
##   3   1  age
##   3   2  age
##   3   3  age
##   3   4  age
##   3   5  age
##   4   1  age
##   4   2  age
##   4   3  age
##   4   4  age
##   4   5  age
##   5   1  age
##   5   2  age
##   5   3  age
##   5   4  age
##   5   5  age
## Warning: Number of logged events: 25
imputed_test <- complete(imputed_test)

fwrite(imputed_test, "test_features_imputed.csv")
imputed_test[1:10,1:10]
##    PatientID Event Histology Mstage Nstage SourceDataset Tstage     age
## 1         13    NA         4      0      0             1      4 44.3970
## 2        155    NA         1      0      3             1      1 63.3183
## 3        404    NA         2      0      2             1      2 64.7255
## 4        407    NA         4      0      0             1      2 65.3635
## 5          9    NA         1      0      0             2      2 50.0000
## 6         49    NA         6      0      0             1      2 86.1410
## 7         55    NA         3      0      0             1      1 75.2663
## 8        200    NA         3      0      0             1      1 85.4511
## 9        170    NA         2      0      3             1      1 69.8727
## 10       387    NA         2      0      3             1      2 52.8569
##    SurvivalTime shape_Compactness1
## 1      788.4177         0.02888522
## 2      427.6501         0.03194837
## 3      173.5872         0.01599883
## 4      389.8780         0.03135766
## 5     1580.7672         0.01781454
## 6      472.5234         0.03816202
## 7     1970.9725         0.03699879
## 8      530.4248         0.03373779
## 9     1067.4630         0.01929334
## 10     378.3248         0.02531470

10.2 Preprocessing Train and Test

In order to measure the AUC of each model we need to split the data randomly (with seed) into two equal parts:

set.seed(1234)
random_splits <- runif(nrow(imputed))
train_df_official <- imputed[random_splits < .5,]
dim(train_df_official)
## [1] 151  62
validate_df_official <- imputed[random_splits >= .5,]
dim(validate_df_official)
## [1] 149  62
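Before going further, it is worth checking that the event rate is comparable across the two halves; a quick sketch:

# event rate per split; a large imbalance would distort the AUC comparison
prop.table(table(train_df_official$Event))
prop.table(table(validate_df_official$Event))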

In order to align the survival and the classification models, we will focus on the probability of reaching the event within 825 days. We tried the median, the mean and other quantiles, but none worked as well as 825 days.

period_choice <- round(quantile(test$SurvivalTime)[[2]]) # 25% quantile, kept for reference
period_choice <- 825                                     # fixed horizon that worked best
  • We also need to create a classification-centric outcome variable measuring whether each patient reached the event within the chosen period. Patients with Event equal to 1 (i.e. the death was observed) and a SurvivalTime within the chosen period get outcome 1; everything else is set to 0:
# classification data set
train_df_classificaiton  <- train_df_official

train_df_classificaiton$ReachedEvent <- ifelse((train_df_classificaiton$Event == 1 &
                                     train_df_classificaiton$SurvivalTime <= period_choice), 1, 0)

summary(train_df_classificaiton$ReachedEvent)
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   0.000   0.000   0.000   0.457   1.000   1.000
validate_df_classification  <- validate_df_official

validate_df_classification$ReachedEvent <- ifelse((validate_df_classification$Event == 1 &
                                        validate_df_classification$SurvivalTime <= period_choice), 1, 0)

summary(validate_df_classification$ReachedEvent)
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.0000  0.0000  0.0000  0.4228  1.0000  1.0000
  • Now we can easily get an AUC score on the probability of reaching the event within our chosen period.

10.3 Survival Model (ranger)

train_df_classificaiton <- as.data.table(train_df_classificaiton)
# build the predictor list: drop PatientID, the outcome columns (Event,
# SurvivalTime, ReachedEvent) and firstorder_Energy (read in as integer64,
# not double, which ranger cannot handle)
var <- paste(colnames(train_df_classificaiton)[train_df_classificaiton[, !names(train_df_classificaiton) %in% c("PatientID",
                                                          "Event",
                                                          "SurvivalTime",
                                                          "ReachedEvent",
                                                          "firstorder_Energy")]],
              collapse ="+")

survival_formula <- formula(paste('Surv(', 'SurvivalTime', ',', 'Event', ') ~ ', var))


survival_formula
## Surv(SurvivalTime, Event) ~ Histology + Mstage + Nstage + SourceDataset + 
##     Tstage + age + shape_Compactness1 + shape_Compactness2 + 
##     shape_Maximum3DDiameter + shape_SphericalDisproportion + 
##     shape_Sphericity + shape_SurfaceArea + shape_SurfaceVolumeRatio + 
##     shape_VoxelVolume + firstorder_Entropy + firstorder_Kurtosis + 
##     firstorder_Maximum + firstorder_Mean + firstorder_MeanAbsoluteDeviation + 
##     firstorder_Median + firstorder_Minimum + firstorder_Range + 
##     firstorder_RootMeanSquared + firstorder_Skewness + firstorder_StandardDeviation + 
##     firstorder_Uniformity + firstorder_Variance + glcm_Autocorrelation + 
##     glcm_ClusterProminence + glcm_ClusterShade + glcm_ClusterTendency + 
##     glcm_Contrast + glcm_Correlation + glcm_DifferenceEntropy + 
##     glcm_DifferenceAverage + glcm_JointEnergy + glcm_JointEntropy + 
##     glcm_Id + glcm_Idm + glcm_Imc1 + glcm_Imc2 + glcm_Idmn + 
##     glcm_Idn + glcm_InverseVariance + glcm_MaximumProbability + 
##     glcm_SumAverage + glcm_SumEntropy + glrlm_ShortRunEmphasis + 
##     glrlm_LongRunEmphasis + glrlm_GrayLevelNonUniformity + glrlm_RunLengthNonUniformity + 
##     glrlm_RunPercentage + glrlm_LowGrayLevelRunEmphasis + glrlm_HighGrayLevelRunEmphasis + 
##     glrlm_ShortRunLowGrayLevelEmphasis + glrlm_ShortRunHighGrayLevelEmphasis + 
##     glrlm_LongRunLowGrayLevelEmphasis + glrlm_LongRunHighGrayLevelEmphasis
survival_model <- ranger(survival_formula,
               data = train_df_classificaiton,
               seed = 1234,
               importance = 'permutation',
               mtry = 2,
               verbose = TRUE,
               num.trees = 50,
               write.forest=TRUE)
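ranger also reports an out-of-bag prediction error for the fitted forest; for survival forests this is one minus Harrell's C-index, so it gives a first quality check without touching the validation split:

# OOB error of the survival forest (= 1 - Harrell's C-index)
survival_model$prediction.error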

# print out coefficients
sort(survival_model$variable.importance, decreasing = TRUE) %>% head(10) %>% names()
##  [1] "firstorder_Skewness"          "glcm_Idm"                    
##  [3] "glrlm_LongRunEmphasis"        "glcm_Idmn"                   
##  [5] "Tstage"                       "glrlm_ShortRunEmphasis"      
##  [7] "glcm_JointEnergy"             "glrlm_GrayLevelNonUniformity"
##  [9] "firstorder_Kurtosis"          "glcm_JointEntropy"
  • Once we have our survival_model object, we can take a look at some probabilities of survival (just for illustration, on rows of the training portion). Let's look at two patients, row 1 and row 56:
plot(survival_model$unique.death.times, survival_model$survival[1,],
     type='l', col='orange', ylim=c(0.2,1))

lines(survival_model$unique.death.times,
      survival_model$survival[56,], col='blue')

  • The orange patient has a higher probability of survival.
print("Orange PatientID: "); imputed[1,c(2,7,8,9)]
## [1] "Orange PatientID: "
##   Event Tstage age SurvivalTime
## 1     0      2  66         1378
print("Blue PatientID: "); imputed[56,c(2,7,8,9)]
## [1] "Blue PatientID: "
##    Event Tstage     age SurvivalTime
## 56     1      4 62.4476          303

10.3.1 Scoring the Random Forest Survival Model

First we get the survival predictions on our validation split, then we flip the survival probability at the chosen period into an event probability and compute the AUC score:

feature_names <- setdiff(names(train_df_classificaiton),
                         c('PatientID', 'ReachedEvent',
                           'SurvivalTime', 'Event'))


suvival_predictions <- predict(survival_model,
                                validate_df_official[, feature_names])

roc(response = validate_df_classification$ReachedEvent,
    predictor = 1 - suvival_predictions$survival[,which(suvival_predictions$unique.death.times
                                                        == period_choice)])
## Setting levels: control = 0, case = 1
## Setting direction: controls < cases
## 
## Call:
## roc.default(response = validate_df_classification$ReachedEvent,     predictor = 1 - suvival_predictions$survival[, which(suvival_predictions$unique.death.times ==         period_choice)])
## 
## Data: 1 - suvival_predictions$survival[, which(suvival_predictions$unique.death.times == period_choice)] in 86 controls (validate_df_classification$ReachedEvent 0) < 63 cases (validate_df_classification$ReachedEvent 1).
## Area under the curve: 0.7097
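The 0.42 cut-off used for the submission below was picked by hand; a hedged alternative is to let pROC suggest the threshold that maximizes Youden's J on the validation split (storing the roc object instead of just printing it):

roc_valid <- roc(response = validate_df_classification$ReachedEvent,
                 predictor = 1 - suvival_predictions$survival[,
                   which(suvival_predictions$unique.death.times == period_choice)])

# threshold maximizing sensitivity + specificity - 1 (Youden's J)
coords(roc_valid, x = "best", best.method = "youden",
       ret = c("threshold", "sensitivity", "specificity"))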

10.3.2 RF Survival Prediction

survival_predictions_test <- predict(survival_model, imputed_test[, feature_names])

predictor <- 1 -
  survival_predictions_test$survival[,which(survival_predictions_test$unique.death.times
                                                           == period_choice)]

pred_survival <- data.frame(
  PatientID = imputed_test$PatientID,
  Event = predictor
)

output_test <- fread("output_test.csv")

submission_surv <- output_test %>%
  select(PatientID, SurvivalTime) %>%
  left_join(pred_survival, by = "PatientID")

submission_surv["Event"] <- ifelse(submission_surv$Event > 0.42, 1,0)

fwrite(submission_surv, "submission_surv.csv")




pred_bin_survival <- as.numeric(predictor > 0.5)

print("ratio of predicted Events in test datset: "); table(pred_bin_survival)
## [1] "ratio of predicted Events in test datset: "
## pred_bin_survival
##  0  1 
## 67 58
print("ratio of Events in traindatset: "); table(train$Event)
## [1] "ratio of Events in traindatset: "
## 
##   0   1 
## 138 162

10.3.3 CoxPH model

#sapply(train_df_official, function(x) sum(is.na(x)))

# recode the status to 1/2 coding (1 = censored, 2 = dead), which Surv() accepts
tr <- train_df_classificaiton %>%
  mutate_if(is.integer, as.numeric) %>%
  mutate(Event = ifelse(Event == 0, 1, 2)) #%>%
   #select(which(sapply(.,is.numeric)))


res.cox <- coxph(survival_formula, data =  tr)


summary(res.cox)
## Call:
## coxph(formula = survival_formula, data = tr)
## 
##   n= 151, number of events= 80 
## 
##                                            coef   exp(coef)    se(coef)      z
## Histology                            -5.797e-03   9.942e-01   9.858e-02 -0.059
## Mstage                                4.333e-01   1.542e+00   3.529e-01  1.228
## Nstage                                6.196e-01   1.858e+00   1.819e-01  3.406
## SourceDataset                        -1.113e+00   3.286e-01   1.485e+00 -0.749
## Tstage                               -3.498e-01   7.048e-01   2.411e-01 -1.451
## age                                   5.479e-02   1.056e+00   2.056e-02  2.665
## shape_Compactness1                    3.078e+04         Inf   3.961e+04  0.777
## shape_Compactness2                   -2.631e+02  5.230e-115   3.740e+02 -0.704
## shape_Maximum3DDiameter               3.260e-02   1.033e+00   1.495e-02  2.181
## shape_SphericalDisproportion         -6.115e+01   2.775e-27   6.672e+01 -0.916
## shape_Sphericity                     -1.766e+03   0.000e+00   2.221e+03 -0.795
## shape_SurfaceArea                     2.328e-04   1.000e+00   1.434e-04  1.623
## shape_SurfaceVolumeRatio              1.475e+01   2.557e+06   8.935e+00  1.651
## shape_VoxelVolume                    -2.941e-05   1.000e+00   1.894e-05 -1.552
## firstorder_Entropy                   -3.371e+01   2.301e-15   1.225e+01 -2.751
## firstorder_Kurtosis                   6.347e-02   1.066e+00   4.481e-02  1.416
## firstorder_Maximum                   -2.742e-03   9.973e-01   1.845e-03 -1.486
## firstorder_Mean                       6.984e-03   1.007e+00   3.387e-02  0.206
## firstorder_MeanAbsoluteDeviation      2.020e-03   1.002e+00   7.495e-02  0.027
## firstorder_Median                    -2.204e-02   9.782e-01   1.713e-02 -1.286
## firstorder_Minimum                    9.557e-03   1.010e+00   2.600e-02  0.368
## firstorder_Range                             NA          NA   0.000e+00     NA
## firstorder_RootMeanSquared           -3.392e-02   9.666e-01   1.959e-02 -1.732
## firstorder_Skewness                   7.280e-01   2.071e+00   5.530e-01  1.316
## firstorder_StandardDeviation          3.563e-02   1.036e+00   6.281e-02  0.567
## firstorder_Uniformity                -1.163e+02   3.124e-51   9.259e+01 -1.256
## firstorder_Variance                   1.868e-04   1.000e+00   1.884e-04  0.991
## glcm_Autocorrelation                  1.831e-02   1.018e+00   2.400e-02  0.763
## glcm_ClusterProminence               -1.956e-06   1.000e+00   4.026e-06 -0.486
## glcm_ClusterShade                    -2.373e-04   9.998e-01   2.042e-04 -1.162
## glcm_ClusterTendency                 -2.179e-02   9.784e-01   2.769e-02 -0.787
## glcm_Contrast                         4.032e-01   1.497e+00   3.658e-01  1.102
## glcm_Correlation                     -1.155e+01   9.612e-06   8.828e+00 -1.309
## glcm_DifferenceEntropy                8.310e+00   4.066e+03   6.158e+00  1.350
## glcm_DifferenceAverage               -2.328e+01   7.756e-11   1.656e+01 -1.406
## glcm_JointEnergy                      1.985e+02   1.675e+86   1.199e+02  1.656
## glcm_JointEntropy                     9.067e+00   8.664e+03   5.628e+00  1.611
## glcm_Id                              -1.372e+03   0.000e+00   1.006e+03 -1.364
## glcm_Idm                              9.665e+02         Inf   7.964e+02  1.214
## glcm_Imc1                            -4.851e+01   8.536e-22   2.510e+01 -1.932
## glcm_Imc2                            -3.449e+00   3.179e-02   1.188e+01 -0.290
## glcm_Idmn                            -3.589e+02  1.298e-156   2.749e+02 -1.306
## glcm_Idn                              2.433e+02  4.510e+105   1.734e+02  1.403
## glcm_InverseVariance                 -5.010e+01   1.754e-22   1.039e+02 -0.482
## glcm_MaximumProbability              -2.376e+01   4.786e-11   1.512e+01 -1.571
## glcm_SumAverage                      -4.526e-01   6.360e-01   4.397e-01 -1.029
## glcm_SumEntropy                       7.811e+00   2.468e+03   7.712e+00  1.013
## glrlm_ShortRunEmphasis                3.310e+01   2.384e+14   4.115e+01  0.805
## glrlm_LongRunEmphasis                -8.420e-01   4.308e-01   9.536e-01 -0.883
## glrlm_GrayLevelNonUniformity          7.652e-05   1.000e+00   9.188e-05  0.833
## glrlm_RunLengthNonUniformity          2.938e-07   1.000e+00   2.161e-05  0.014
## glrlm_RunPercentage                  -1.199e+02   8.732e-53   8.680e+01 -1.381
## glrlm_LowGrayLevelRunEmphasis        -4.078e+02  7.494e-178   6.236e+02 -0.654
## glrlm_HighGrayLevelRunEmphasis       -2.196e-03   9.978e-01   2.740e-02 -0.080
## glrlm_ShortRunLowGrayLevelEmphasis    5.478e+02  8.391e+237   7.039e+02  0.778
## glrlm_ShortRunHighGrayLevelEmphasis   1.913e-03   1.002e+00   1.737e-02  0.110
## glrlm_LongRunLowGrayLevelEmphasis     6.043e+00   4.212e+02   3.135e+01  0.193
## glrlm_LongRunHighGrayLevelEmphasis   -1.846e-04   9.998e-01   5.328e-04 -0.347
##                                     Pr(>|z|)    
## Histology                           0.953109    
## Mstage                              0.219558    
## Nstage                              0.000659 ***
## SourceDataset                       0.453630    
## Tstage                              0.146770    
## age                                 0.007702 ** 
## shape_Compactness1                  0.437107    
## shape_Compactness2                  0.481730    
## shape_Maximum3DDiameter             0.029198 *  
## shape_SphericalDisproportion        0.359411    
## shape_Sphericity                    0.426525    
## shape_SurfaceArea                   0.104510    
## shape_SurfaceVolumeRatio            0.098677 .  
## shape_VoxelVolume                   0.120607    
## firstorder_Entropy                  0.005933 ** 
## firstorder_Kurtosis                 0.156668    
## firstorder_Maximum                  0.137265    
## firstorder_Mean                     0.836644    
## firstorder_MeanAbsoluteDeviation    0.978502    
## firstorder_Median                   0.198289    
## firstorder_Minimum                  0.713172    
## firstorder_Range                          NA    
## firstorder_RootMeanSquared          0.083272 .  
## firstorder_Skewness                 0.188034    
## firstorder_StandardDeviation        0.570550    
## firstorder_Uniformity               0.209134    
## firstorder_Variance                 0.321452    
## glcm_Autocorrelation                0.445304    
## glcm_ClusterProminence              0.626993    
## glcm_ClusterShade                   0.245185    
## glcm_ClusterTendency                0.431256    
## glcm_Contrast                       0.270354    
## glcm_Correlation                    0.190681    
## glcm_DifferenceEntropy              0.177133    
## glcm_DifferenceAverage              0.159747    
## glcm_JointEnergy                    0.097787 .  
## glcm_JointEntropy                   0.107165    
## glcm_Id                             0.172725    
## glcm_Idm                            0.224908    
## glcm_Imc1                           0.053311 .  
## glcm_Imc2                           0.771574    
## glcm_Idmn                           0.191627    
## glcm_Idn                            0.160661    
## glcm_InverseVariance                0.629627    
## glcm_MaximumProbability             0.116116    
## glcm_SumAverage                     0.303326    
## glcm_SumEntropy                     0.311145    
## glrlm_ShortRunEmphasis              0.421085    
## glrlm_LongRunEmphasis               0.377249    
## glrlm_GrayLevelNonUniformity        0.404953    
## glrlm_RunLengthNonUniformity        0.989153    
## glrlm_RunPercentage                 0.167266    
## glrlm_LowGrayLevelRunEmphasis       0.513083    
## glrlm_HighGrayLevelRunEmphasis      0.936131    
## glrlm_ShortRunLowGrayLevelEmphasis  0.436394    
## glrlm_ShortRunHighGrayLevelEmphasis 0.912313    
## glrlm_LongRunLowGrayLevelEmphasis   0.847143    
## glrlm_LongRunHighGrayLevelEmphasis  0.728912    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
##                                      exp(coef) exp(-coef)  lower .95  upper .95
## Histology                            9.942e-01  1.006e+00  8.195e-01  1.206e+00
## Mstage                               1.542e+00  6.484e-01  7.723e-01  3.080e+00
## Nstage                               1.858e+00  5.382e-01  1.301e+00  2.654e+00
## SourceDataset                        3.286e-01  3.043e+00  1.789e-02  6.036e+00
## Tstage                               7.048e-01  1.419e+00  4.394e-01  1.131e+00
## age                                  1.056e+00  9.467e-01  1.015e+00  1.100e+00
## shape_Compactness1                         Inf  0.000e+00  0.000e+00        Inf
## shape_Compactness2                  5.230e-115 1.912e+114  0.000e+00 1.255e+204
## shape_Maximum3DDiameter              1.033e+00  9.679e-01  1.003e+00  1.064e+00
## shape_SphericalDisproportion         2.775e-27  3.604e+26  4.466e-84  1.724e+30
## shape_Sphericity                     0.000e+00        Inf  0.000e+00        Inf
## shape_SurfaceArea                    1.000e+00  9.998e-01  1.000e+00  1.001e+00
## shape_SurfaceVolumeRatio             2.557e+06  3.911e-07  6.343e-02  1.031e+14
## shape_VoxelVolume                    1.000e+00  1.000e+00  9.999e-01  1.000e+00
## firstorder_Entropy                   2.301e-15  4.347e+14  8.604e-26  6.152e-05
## firstorder_Kurtosis                  1.066e+00  9.385e-01  9.759e-01  1.163e+00
## firstorder_Maximum                   9.973e-01  1.003e+00  9.937e-01  1.001e+00
## firstorder_Mean                      1.007e+00  9.930e-01  9.423e-01  1.076e+00
## firstorder_MeanAbsoluteDeviation     1.002e+00  9.980e-01  8.651e-01  1.161e+00
## firstorder_Median                    9.782e-01  1.022e+00  9.459e-01  1.012e+00
## firstorder_Minimum                   1.010e+00  9.905e-01  9.594e-01  1.062e+00
## firstorder_Range                            NA         NA         NA         NA
## firstorder_RootMeanSquared           9.666e-01  1.035e+00  9.302e-01  1.004e+00
## firstorder_Skewness                  2.071e+00  4.829e-01  7.006e-01  6.122e+00
## firstorder_StandardDeviation         1.036e+00  9.650e-01  9.162e-01  1.172e+00
## firstorder_Uniformity                3.124e-51  3.201e+50 4.776e-130  2.043e+28
## firstorder_Variance                  1.000e+00  9.998e-01  9.998e-01  1.001e+00
## glcm_Autocorrelation                 1.018e+00  9.819e-01  9.717e-01  1.068e+00
## glcm_ClusterProminence               1.000e+00  1.000e+00  1.000e+00  1.000e+00
## glcm_ClusterShade                    9.998e-01  1.000e+00  9.994e-01  1.000e+00
## glcm_ClusterTendency                 9.784e-01  1.022e+00  9.268e-01  1.033e+00
## glcm_Contrast                        1.497e+00  6.682e-01  7.307e-01  3.065e+00
## glcm_Correlation                     9.612e-06  1.040e+05  2.938e-13  3.144e+02
## glcm_DifferenceEntropy               4.066e+03  2.459e-04  2.333e-02  7.088e+08
## glcm_DifferenceAverage               7.756e-11  1.289e+10  6.238e-25  9.644e+03
## glcm_JointEnergy                     1.675e+86  5.970e-87  1.424e-16 1.971e+188
## glcm_JointEntropy                    8.664e+03  1.154e-04  1.404e-01  5.349e+08
## glcm_Id                              0.000e+00        Inf  0.000e+00 4.144e+260
## glcm_Idm                                   Inf  0.000e+00 7.064e-259        Inf
## glcm_Imc1                            8.536e-22  1.172e+21  3.646e-43  1.998e+00
## glcm_Imc2                            3.179e-02  3.146e+01  2.458e-12  4.110e+08
## glcm_Idmn                           1.298e-156 7.706e+155  0.000e+00  1.252e+78
## glcm_Idn                            4.510e+105 2.217e-106  1.101e-42 1.847e+253
## glcm_InverseVariance                 1.754e-22  5.701e+21 6.663e-111  4.617e+66
## glcm_MaximumProbability              4.786e-11  2.090e+10  6.413e-24  3.571e+02
## glcm_SumAverage                      6.360e-01  1.572e+00  2.686e-01  1.506e+00
## glcm_SumEntropy                      2.468e+03  4.052e-04  6.724e-04  9.057e+09
## glrlm_ShortRunEmphasis               2.384e+14  4.195e-15  2.251e-21  2.525e+49
## glrlm_LongRunEmphasis                4.308e-01  2.321e+00  6.646e-02  2.793e+00
## glrlm_GrayLevelNonUniformity         1.000e+00  9.999e-01  9.999e-01  1.000e+00
## glrlm_RunLengthNonUniformity         1.000e+00  1.000e+00  1.000e+00  1.000e+00
## glrlm_RunPercentage                  8.732e-53  1.145e+52 1.147e-126  6.647e+21
## glrlm_LowGrayLevelRunEmphasis       7.494e-178 1.334e+177  0.000e+00        Inf
## glrlm_HighGrayLevelRunEmphasis       9.978e-01  1.002e+00  9.456e-01  1.053e+00
## glrlm_ShortRunLowGrayLevelEmphasis  8.391e+237 1.192e-238  0.000e+00        Inf
## glrlm_ShortRunHighGrayLevelEmphasis  1.002e+00  9.981e-01  9.684e-01  1.037e+00
## glrlm_LongRunLowGrayLevelEmphasis    4.212e+02  2.374e-03  8.700e-25  2.039e+29
## glrlm_LongRunHighGrayLevelEmphasis   9.998e-01  1.000e+00  9.988e-01  1.001e+00
## 
## Concordance= 0.824  (se = 0.026 )
## Likelihood ratio test= 132.2  on 57 df,   p=7e-08
## Wald test            = 55.79  on 57 df,   p=0.5
## Score (logrank) test = 134.4  on 57 df,   p=3e-08
  • Refit the model keeping only the most significant predictors:
cox2 <- coxph(Surv(SurvivalTime, Event) ~ Mstage + Nstage + SourceDataset + age + shape_Compactness1 + shape_Compactness2 + shape_SphericalDisproportion +shape_Sphericity+ glrlm_ShortRunEmphasis, data = tr )

summary(cox2)
## Call:
## coxph(formula = Surv(SurvivalTime, Event) ~ Mstage + Nstage + 
##     SourceDataset + age + shape_Compactness1 + shape_Compactness2 + 
##     shape_SphericalDisproportion + shape_Sphericity + glrlm_ShortRunEmphasis, 
##     data = tr)
## 
##   n= 151, number of events= 80 
## 
##                                     coef   exp(coef)    se(coef)      z
## Mstage                         2.813e-01   1.325e+00   2.461e-01  1.143
## Nstage                         2.949e-01   1.343e+00   1.096e-01  2.691
## SourceDataset                 -6.354e-01   5.297e-01   3.725e-01 -1.706
## age                            4.591e-02   1.047e+00   1.334e-02  3.442
## shape_Compactness1             3.849e+02  1.439e+167   2.648e+04  0.015
## shape_Compactness2             4.138e+00   6.265e+01   2.515e+02  0.016
## shape_SphericalDisproportion  -3.083e+00   4.583e-02   4.440e+01 -0.069
## shape_Sphericity              -3.712e+01   7.583e-17   1.484e+03 -0.025
## glrlm_ShortRunEmphasis        -3.916e+00   1.991e-02   1.383e+00 -2.832
##                              Pr(>|z|)    
## Mstage                       0.253016    
## Nstage                       0.007124 ** 
## SourceDataset                0.088075 .  
## age                          0.000577 ***
## shape_Compactness1           0.988404    
## shape_Compactness2           0.986872    
## shape_SphericalDisproportion 0.944653    
## shape_Sphericity             0.980040    
## glrlm_ShortRunEmphasis       0.004633 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
##                               exp(coef) exp(-coef)  lower .95  upper .95
## Mstage                        1.325e+00  7.548e-01  8.179e-01  2.146e+00
## Nstage                        1.343e+00  7.446e-01  1.083e+00  1.665e+00
## SourceDataset                 5.297e-01  1.888e+00  2.553e-01  1.099e+00
## age                           1.047e+00  9.551e-01  1.020e+00  1.075e+00
## shape_Compactness1           1.439e+167 6.949e-168  0.000e+00        Inf
## shape_Compactness2            6.265e+01  1.596e-02 5.732e-213 6.848e+215
## shape_SphericalDisproportion  4.583e-02  2.182e+01  7.304e-40  2.876e+36
## shape_Sphericity              7.583e-17  1.319e+16  0.000e+00        Inf
## glrlm_ShortRunEmphasis        1.991e-02  5.022e+01  1.324e-03  2.995e-01
## 
## Concordance= 0.721  (se = 0.028 )
## Likelihood ratio test= 46.96  on 9 df,   p=4e-07
## Wald test            = 41  on 9 df,   p=5e-06
## Score (logrank) test = 49.23  on 9 df,   p=2e-07
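For reporting, the hazard ratios can also be pulled into a tidy data frame rather than read off the summary; a sketch assuming the broom package is installed:

library(broom)

# hazard ratios (exp(coef)) with 95% CIs, most significant first
tidy(cox2, exponentiate = TRUE, conf.int = TRUE) %>%
  dplyr::arrange(p.value)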

10.3.3.1 Testing the proportional hazards assumption

test.cox2 <- cox.zph(cox2)
test.cox2
##                                 chisq df      p
## Mstage                        9.73549  1 0.0018
## Nstage                        0.81714  1 0.3660
## SourceDataset                 0.00128  1 0.9714
## age                           0.07673  1 0.7818
## shape_Compactness1            1.08386  1 0.2978
## shape_Compactness2            0.97642  1 0.3231
## shape_SphericalDisproportion  0.99967  1 0.3174
## shape_Sphericity              1.09885  1 0.2945
## glrlm_ShortRunEmphasis        0.25219  1 0.6155
## GLOBAL                       14.81042  9 0.0963
require("survminer")

ggcoxzph(test.cox2)

  • In the figure above, the solid line is a smoothing spline fit to the plot, with the dashed lines representing a +/- 2-standard-error band around the fit. From the table, Mstage violates the proportional hazards assumption (p = 0.0018), while the global test is borderline (p = 0.096).
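One standard remedy when a covariate breaks the proportional hazards assumption is to stratify the baseline hazard on it instead of estimating a proportional effect; a hedged sketch stratifying on Mstage:

# same predictors as cox2, but with Mstage as a stratum rather than a covariate
cox2_strat <- coxph(Surv(SurvivalTime, Event) ~ Nstage + SourceDataset + age +
                      shape_Compactness1 + shape_Compactness2 +
                      shape_SphericalDisproportion + shape_Sphericity +
                      glrlm_ShortRunEmphasis + strata(Mstage),
                    data = tr)
cox.zph(cox2_strat) # re-check the assumption for the remaining terms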
#Var <- stringr::str_split(string = var, pattern = "\\+", simplify = TRUE, )
#coxph.fit(tr %>% select(-Event), tr$Event)


fit <- survfit(cox2, newdata = validate_df_classification, data = train_df_classificaiton)

#ggsurvplot(fit, conf.int = TRUE, palette = "Dark2", 
#           censor = FALSE)

summary(fit)
## Call: survfit(formula = cox2, newdata = validate_df_classification, 
##     data = train_df_classificaiton)
## 
##  time n.risk n.event survival1 survival2 survival3 survival4 survival5
##    24    150       1     0.994    0.9881     0.996    0.9875     0.990
##    25    149       1     0.988    0.9763     0.992    0.9751     0.979
##    48    148       1     0.982    0.9645     0.987    0.9627     0.969
##    65    147       1     0.976    0.9527     0.983    0.9504     0.959
##    71    146       1     0.970    0.9411     0.979    0.9382     0.948
##    76    145       1     0.964    0.9294     0.974    0.9259     0.938
##    77    144       1     0.957    0.9176     0.970    0.9136     0.928
##    78    143       1     0.951    0.9059     0.966    0.9014     0.917
##    79    142       1     0.945    0.8942     0.961    0.8891     0.907
##    86    141       1     0.939    0.8825     0.957    0.8769     0.896
##    96    140       1     0.932    0.8710     0.952    0.8649     0.886
##    98    139       1     0.926    0.8594     0.948    0.8529     0.876
##   113    138       1     0.920    0.8479     0.943    0.8409     0.866
##   120    137       2     0.907    0.8249     0.934    0.8169     0.845
##   131    135       1     0.901    0.8135     0.929    0.8050     0.835
##   132    134       1     0.894    0.8016     0.925    0.7926     0.824
##   136    133       1     0.887    0.7897     0.920    0.7803     0.813
##   141    132       1     0.880    0.7778     0.915    0.7680     0.803
##   143    131       1     0.874    0.7661     0.910    0.7558     0.792
##   159    130       1     0.867    0.7544     0.905    0.7437     0.781
##   166    129       1     0.860    0.7429     0.900    0.7318     0.771
##   177    128       1     0.853    0.7314     0.895    0.7199     0.760
##   183    127       1     0.847    0.7198     0.890    0.7079     0.750
##   192    126       1     0.840    0.7083     0.885    0.6960     0.739
##   193    125       1     0.833    0.6968     0.880    0.6841     0.729
##   195    124       1     0.825    0.6848     0.874    0.6718     0.718
##   210    123       1     0.818    0.6730     0.869    0.6596     0.707
##   213    122       1     0.811    0.6612     0.863    0.6475     0.696
##   220    121       1     0.803    0.6494     0.858    0.6354     0.685
##   258    119       1     0.796    0.6375     0.852    0.6231     0.674
##   259    117       1     0.788    0.6251     0.846    0.6104     0.663
##   261    116       1     0.780    0.6129     0.841    0.5979     0.651
##   265    115       1     0.772    0.6008     0.835    0.5855     0.640
##   287    114       1     0.765    0.5889     0.829    0.5733     0.629
##   291    113       1     0.757    0.5770     0.823    0.5612     0.618
##   303    111       1     0.749    0.5649     0.817    0.5488     0.607
##   321    108       1     0.740    0.5522     0.810    0.5359     0.595
##   325    107       1     0.731    0.5394     0.803    0.5228     0.583
##   336    106       1     0.723    0.5268     0.797    0.5099     0.571
##   340    105       1     0.714    0.5142     0.790    0.4972     0.559
##   342    104       1     0.705    0.5019     0.783    0.4846     0.547
##   360    103       1     0.696    0.4896     0.776    0.4723     0.535
##   370    102       1     0.688    0.4776     0.769    0.4601     0.524
##   377    100       1     0.679    0.4657     0.762    0.4480     0.512
##   401     99       1     0.670    0.4535     0.755    0.4357     0.500
##   409     98       1     0.661    0.4416     0.748    0.4237     0.489
##   426     97       1     0.652    0.4298     0.741    0.4119     0.478
##   433     96       1     0.643    0.4183     0.734    0.4002     0.466
##   456     95       1     0.634    0.4069     0.727    0.3888     0.455
##   463     94       1     0.625    0.3957     0.720    0.3775     0.444
##   465     93       1     0.616    0.3847     0.712    0.3665     0.433
##   472     91       1     0.607    0.3735     0.705    0.3553     0.422
##   508     88       1     0.598    0.3621     0.697    0.3440     0.411
##   512     87       1     0.588    0.3509     0.690    0.3328     0.400
##   515     86       2     0.569    0.3292     0.674    0.3112     0.378
##   527     81       1     0.559    0.3179     0.666    0.3000     0.367
##   539     80       1     0.549    0.3068     0.657    0.2889     0.355
##   575     77       1     0.539    0.2950     0.648    0.2773     0.343
##   618     75       1     0.527    0.2831     0.639    0.2655     0.331
##   632     73       1     0.516    0.2712     0.629    0.2538     0.319
##   635     72       1     0.505    0.2597     0.620    0.2425     0.307
##   639     71       1     0.494    0.2484     0.610    0.2315     0.296
##   646     70       1     0.483    0.2375     0.600    0.2208     0.284
##   685     68       1     0.471    0.2264     0.590    0.2100     0.272
##   703     66       1     0.459    0.2152     0.580    0.1991     0.261
##   754     63       1     0.446    0.2034     0.568    0.1876     0.248
##   815     59       1     0.432    0.1914     0.556    0.1760     0.235
##   907     53       1     0.417    0.1783     0.542    0.1634     0.221
##   911     52       1     0.402    0.1657     0.528    0.1513     0.207
##   991     51       1     0.387    0.1539     0.515    0.1400     0.194
##  1028     49       1     0.372    0.1425     0.501    0.1291     0.182
##  1076     48       1     0.358    0.1318     0.487    0.1189     0.170
##  1141     47       1     0.344    0.1215     0.473    0.1092     0.158
##  1213     41       1     0.327    0.1101     0.457    0.0984     0.145
##  1280     37       1     0.309    0.0988     0.440    0.0878     0.132
##  1295     35       1     0.292    0.0882     0.422    0.0780     0.119
##  1369     32       1     0.272    0.0769     0.402    0.0675     0.106
##  1548     22       1     0.240    0.0598     0.368    0.0519     0.085
##  survival6 survival7 survival8 survival9 survival10 survival11 survival12
##      0.991     0.995     0.997     0.992      0.997      0.997      0.994
##      0.983     0.989     0.995     0.985      0.994      0.995      0.989
##      0.974     0.984     0.992     0.977      0.990      0.992      0.983
##      0.966     0.978     0.989     0.969      0.987      0.989      0.977
##      0.957     0.973     0.987     0.961      0.984      0.986      0.971
##      0.949     0.967     0.984     0.954      0.981      0.983      0.966
##      0.940     0.961     0.981     0.946      0.977      0.980      0.960
##      0.932     0.956     0.978     0.938      0.974      0.978      0.954
##      0.923     0.950     0.976     0.930      0.971      0.975      0.948
##      0.914     0.944     0.973     0.922      0.967      0.972      0.942
##      0.906     0.939     0.970     0.914      0.964      0.969      0.936
##      0.897     0.933     0.967     0.906      0.960      0.966      0.930
##      0.889     0.927     0.964     0.898      0.957      0.963      0.924
##      0.871     0.916     0.958     0.883      0.950      0.957      0.912
##      0.863     0.910     0.955     0.875      0.946      0.954      0.906
##      0.854     0.904     0.952     0.866      0.943      0.951      0.899
##      0.845     0.898     0.949     0.858      0.939      0.947      0.893
##      0.836     0.892     0.946     0.849      0.935      0.944      0.887
##      0.826     0.885     0.943     0.841      0.931      0.941      0.880
##      0.817     0.879     0.940     0.833      0.928      0.937      0.874
##      0.808     0.873     0.936     0.825      0.924      0.934      0.867
##      0.800     0.867     0.933     0.816      0.920      0.931      0.861
##      0.790     0.861     0.930     0.808      0.916      0.927      0.854
##      0.781     0.854     0.927     0.799      0.912      0.924      0.848
##      0.772     0.848     0.923     0.791      0.908      0.920      0.841
##      0.763     0.841     0.920     0.782      0.904      0.917      0.834
##      0.753     0.834     0.916     0.773      0.900      0.913      0.827
##      0.744     0.828     0.913     0.764      0.896      0.910      0.820
##      0.734     0.821     0.909     0.756      0.891      0.906      0.813
##      0.725     0.814     0.905     0.747      0.887      0.902      0.806
##      0.715     0.807     0.901     0.737      0.882      0.898      0.798
##      0.705     0.800     0.897     0.728      0.878      0.894      0.791
##      0.695     0.792     0.894     0.718      0.873      0.890      0.783
##      0.685     0.785     0.890     0.709      0.868      0.886      0.776
##      0.675     0.778     0.886     0.700      0.864      0.882      0.768
##      0.665     0.770     0.881     0.690      0.859      0.877      0.761
##      0.654     0.762     0.877     0.680      0.854      0.873      0.752
##      0.643     0.754     0.872     0.670      0.848      0.868      0.744
##      0.632     0.746     0.868     0.660      0.843      0.863      0.736
##      0.621     0.738     0.863     0.649      0.838      0.859      0.727
##      0.611     0.730     0.859     0.639      0.832      0.854      0.719
##      0.600     0.722     0.854     0.629      0.827      0.849      0.710
##      0.589     0.713     0.849     0.619      0.821      0.844      0.702
##      0.579     0.705     0.845     0.609      0.816      0.839      0.693
##      0.568     0.697     0.840     0.598      0.810      0.834      0.685
##      0.557     0.688     0.835     0.588      0.804      0.829      0.676
##      0.547     0.680     0.830     0.578      0.799      0.824      0.667
##      0.536     0.671     0.825     0.568      0.793      0.819      0.659
## [output truncated: predicted survival-probability matrix with one column per
##  patient (survival13 to survival82 in this excerpt) and one row per
##  evaluation time; each column starts near 1 and decreases monotonically
##  toward 0 as time increases]
##       0.768      0.817      0.576      0.449      0.838      0.716      0.484
##       0.762      0.812      0.566      0.437      0.833      0.709      0.473
##       0.756      0.806      0.557      0.426      0.828      0.701      0.462
##       0.743      0.796      0.537      0.405      0.819      0.686      0.441
##       0.736      0.790      0.527      0.393      0.814      0.678      0.429
##       0.729      0.784      0.516      0.382      0.809      0.670      0.418
##       0.721      0.778      0.505      0.370      0.803      0.661      0.406
##       0.713      0.772      0.494      0.358      0.797      0.652      0.394
##       0.705      0.765      0.482      0.346      0.791      0.642      0.382
##       0.697      0.758      0.470      0.334      0.785      0.633      0.370
##       0.689      0.751      0.459      0.322      0.778      0.623      0.358
##       0.681      0.744      0.447      0.310      0.772      0.614      0.346
##       0.672      0.737      0.436      0.298      0.766      0.604      0.334
##       0.663      0.729      0.423      0.286      0.759      0.594      0.322
##       0.653      0.721      0.410      0.273      0.751      0.583      0.309
##       0.643      0.712      0.396      0.260      0.743      0.571      0.295
##       0.630      0.702      0.381      0.246      0.733      0.557      0.280
##       0.618      0.691      0.366      0.231      0.724      0.543      0.265
##       0.606      0.681      0.351      0.218      0.714      0.530      0.251
##       0.594      0.670      0.336      0.205      0.704      0.516      0.237
##       0.582      0.659      0.322      0.192      0.695      0.503      0.224
##       0.569      0.649      0.307      0.180      0.685      0.489      0.211
##       0.554      0.636      0.291      0.166      0.673      0.473      0.196
##       0.538      0.622      0.274      0.152      0.660      0.456      0.181
##       0.522      0.607      0.257      0.139      0.646      0.439      0.167
##       0.503      0.590      0.238      0.124      0.630      0.419      0.151
##       0.471      0.561      0.207      0.101      0.603      0.385      0.125
##  survival83 survival84 survival85 survival86 survival87 survival88 survival89
##       0.997      0.992      0.992      0.991      0.995     0.9885      0.999
##       0.994      0.984      0.984      0.981      0.990     0.9770      0.999
##       0.991      0.976      0.976      0.972      0.984     0.9655      0.998
##       0.988      0.969      0.968      0.962      0.979     0.9542      0.997
##       0.985      0.961      0.960      0.953      0.974     0.9428      0.996
##       0.982      0.953      0.952      0.944      0.969     0.9315      0.996
##       0.979      0.945      0.944      0.934      0.963     0.9201      0.995
##       0.976      0.937      0.936      0.925      0.958     0.9086      0.994
##       0.973      0.929      0.928      0.915      0.953     0.8973      0.993
##       0.970      0.921      0.920      0.906      0.947     0.8859      0.993
##       0.966      0.913      0.912      0.897      0.942     0.8747      0.992
##       0.963      0.905      0.903      0.887      0.936     0.8634      0.991
##       0.960      0.897      0.895      0.878      0.931     0.8522      0.990
##       0.954      0.881      0.879      0.859      0.920     0.8298      0.989
##       0.950      0.872      0.871      0.849      0.914     0.8186      0.988
##       0.947      0.864      0.862      0.840      0.909     0.8070      0.987
##       0.943      0.856      0.854      0.830      0.903     0.7954      0.986
##       0.940      0.847      0.845      0.820      0.897     0.7839      0.985
##       0.936      0.839      0.836      0.810      0.891     0.7723      0.984
##       0.933      0.830      0.828      0.800      0.885     0.7610      0.983
##       0.929      0.822      0.819      0.791      0.879     0.7497      0.982
##       0.926      0.813      0.811      0.781      0.873     0.7384      0.981
##       0.922      0.805      0.802      0.771      0.867     0.7271      0.981
##       0.918      0.796      0.794      0.761      0.861     0.7158      0.980
##       0.915      0.788      0.785      0.752      0.855     0.7045      0.979
##       0.911      0.779      0.776      0.741      0.849     0.6928      0.978
##       0.907      0.770      0.767      0.731      0.842     0.6812      0.977
##       0.903      0.761      0.758      0.721      0.836     0.6697      0.976
##       0.899      0.752      0.749      0.711      0.829     0.6581      0.975
##       0.895      0.743      0.740      0.701      0.823     0.6464      0.973
##       0.890      0.733      0.730      0.690      0.816     0.6342      0.972
##       0.886      0.724      0.720      0.679      0.809     0.6222      0.971
##       0.882      0.714      0.711      0.669      0.802     0.6103      0.970
##       0.877      0.705      0.701      0.658      0.795     0.5986      0.969
##       0.873      0.695      0.692      0.648      0.788     0.5869      0.968
##       0.868      0.686      0.682      0.637      0.781     0.5749      0.966
##       0.864      0.675      0.672      0.625      0.773     0.5624      0.965
##       0.859      0.665      0.661      0.614      0.765     0.5497      0.964
##       0.854      0.655      0.651      0.603      0.758     0.5372      0.962
##       0.849      0.644      0.640      0.591      0.750     0.5248      0.961
##       0.843      0.634      0.630      0.580      0.742     0.5126      0.960
##       0.838      0.624      0.620      0.569      0.734     0.5005      0.958
##       0.833      0.614      0.609      0.558      0.726     0.4886      0.957
##       0.828      0.604      0.599      0.547      0.718     0.4767      0.955
##       0.823      0.593      0.589      0.535      0.710     0.4646      0.954
##       0.817      0.583      0.578      0.524      0.702     0.4528      0.952
##       0.812      0.572      0.568      0.513      0.694     0.4411      0.951
##       0.806      0.562      0.558      0.502      0.686     0.4296      0.949
##       0.801      0.552      0.547      0.491      0.677     0.4183      0.948
##       0.795      0.542      0.537      0.481      0.669     0.4071      0.946
##       0.790      0.532      0.527      0.470      0.661     0.3961      0.945
##       0.784      0.522      0.517      0.459      0.653     0.3849      0.943
##       0.778      0.511      0.506      0.448      0.644     0.3736      0.941
##       0.772      0.501      0.496      0.437      0.635     0.3624      0.939
##       0.760      0.480      0.475      0.416      0.618     0.3406      0.936
##       0.754      0.469      0.464      0.404      0.609     0.3293      0.934
##       0.747      0.458      0.453      0.393      0.599     0.3181      0.932
##       0.740      0.446      0.441      0.381      0.589     0.3062      0.930
##       0.732      0.434      0.429      0.369      0.579     0.2943      0.927
##       0.725      0.422      0.417      0.357      0.568     0.2823      0.925
##       0.717      0.410      0.405      0.345      0.558     0.2706      0.923
##       0.709      0.398      0.393      0.333      0.547     0.2593      0.920
##       0.701      0.387      0.382      0.321      0.536     0.2482      0.918
##       0.693      0.375      0.370      0.309      0.525     0.2370      0.915
##       0.684      0.362      0.357      0.297      0.514     0.2255      0.912
##       0.675      0.349      0.344      0.284      0.502     0.2136      0.909
##       0.665      0.335      0.330      0.271      0.489     0.2013      0.906
##       0.653      0.320      0.315      0.256      0.474     0.1880      0.902
##       0.642      0.305      0.300      0.242      0.459     0.1751      0.898
##       0.630      0.290      0.285      0.228      0.445     0.1630      0.894
##       0.618      0.276      0.271      0.214      0.430     0.1512      0.890
##       0.606      0.262      0.257      0.202      0.416     0.1402      0.886
##       0.594      0.248      0.244      0.189      0.401     0.1296      0.882
##       0.580      0.233      0.228      0.175      0.384     0.1178      0.876
##       0.565      0.217      0.212      0.161      0.367     0.1060      0.871
##       0.549      0.201      0.197      0.147      0.349     0.0951      0.865
##       0.531      0.184      0.179      0.132      0.329     0.0832      0.858
##       0.499      0.156      0.151      0.108      0.295     0.0652      0.845
##  survival90 survival91 survival92 survival93 survival94 survival95 survival96
##       0.992     0.9898      0.993      0.998     0.9881     0.9828     0.9879
##       0.983     0.9796      0.986      0.996     0.9761     0.9658     0.9759
##       0.975     0.9694      0.979      0.994     0.9642     0.9488     0.9638
##       0.967     0.9593      0.972      0.992     0.9525     0.9322     0.9519
##       0.958     0.9492      0.965      0.990     0.9407     0.9156     0.9400
##       0.950     0.9391      0.958      0.988     0.9290     0.8992     0.9281
##       0.942     0.9289      0.951      0.985     0.9171     0.8827     0.9162
##       0.933     0.9187      0.944      0.983     0.9053     0.8664     0.9042
##       0.925     0.9085      0.937      0.981     0.8936     0.8502     0.8923
##       0.916     0.8983      0.930      0.979     0.8818     0.8341     0.8805
##       0.908     0.8882      0.923      0.977     0.8702     0.8183     0.8687
##       0.899     0.8781      0.916      0.974     0.8586     0.8027     0.8570
##       0.891     0.8680      0.909      0.972     0.8470     0.7871     0.8453
##       0.874     0.8478      0.894      0.968     0.8239     0.7563     0.8219
##       0.865     0.8377      0.887      0.965     0.8124     0.7411     0.8103
##       0.856     0.8272      0.879      0.963     0.8005     0.7254     0.7983
##       0.848     0.8166      0.872      0.961     0.7885     0.7099     0.7862
##       0.839     0.8061      0.864      0.958     0.7766     0.6945     0.7742
##       0.830     0.7956      0.856      0.956     0.7648     0.6793     0.7622
##       0.821     0.7853      0.849      0.953     0.7531     0.6644     0.7505
##       0.812     0.7749      0.841      0.951     0.7415     0.6496     0.7388
##       0.803     0.7646      0.834      0.948     0.7299     0.6351     0.7271
##       0.794     0.7543      0.826      0.945     0.7183     0.6206     0.7154
##       0.785     0.7439      0.818      0.943     0.7068     0.6062     0.7037
##       0.776     0.7335      0.811      0.940     0.6952     0.5920     0.6921
##       0.767     0.7227      0.802      0.937     0.6832     0.5773     0.6800
##       0.758     0.7119      0.794      0.935     0.6713     0.5628     0.6680
##       0.748     0.7013      0.786      0.932     0.6595     0.5487     0.6562
##       0.739     0.6905      0.778      0.929     0.6477     0.5345     0.6442
##       0.729     0.6796      0.770      0.926     0.6357     0.5203     0.6322
##       0.719     0.6683      0.761      0.923     0.6233     0.5057     0.6197
##       0.710     0.6571      0.752      0.920     0.6110     0.4915     0.6073
##       0.700     0.6460      0.744      0.917     0.5989     0.4775     0.5951
##       0.690     0.6350      0.735      0.914     0.5870     0.4638     0.5831
##       0.680     0.6239      0.726      0.910     0.5750     0.4503     0.5711
##       0.670     0.6127      0.717      0.907     0.5629     0.4366     0.5589
##       0.660     0.6009      0.708      0.904     0.5502     0.4224     0.5461
##       0.649     0.5889      0.698      0.900     0.5373     0.4083     0.5332
##       0.638     0.5770      0.689      0.896     0.5247     0.3945     0.5205
##       0.627     0.5652      0.679      0.893     0.5121     0.3809     0.5079
##       0.617     0.5536      0.670      0.889     0.4997     0.3677     0.4954
##       0.606     0.5420      0.660      0.885     0.4875     0.3548     0.4832
##       0.596     0.5305      0.651      0.882     0.4754     0.3422     0.4711
##       0.585     0.5191      0.641      0.878     0.4634     0.3298     0.4590
##       0.575     0.5075      0.631      0.874     0.4513     0.3174     0.4468
##       0.564     0.4960      0.622      0.870     0.4394     0.3054     0.4349
##       0.553     0.4847      0.612      0.866     0.4276     0.2937     0.4231
##       0.543     0.4735      0.602      0.862     0.4160     0.2823     0.4115
##       0.533     0.4624      0.593      0.858     0.4046     0.2712     0.4002
##       0.522     0.4514      0.583      0.854     0.3934     0.2604     0.3889
##       0.512     0.4406      0.574      0.850     0.3824     0.2500     0.3779
##       0.502     0.4296      0.564      0.845     0.3712     0.2395     0.3667
##       0.491     0.4184      0.554      0.841     0.3598     0.2290     0.3553
##       0.480     0.4072      0.544      0.836     0.3486     0.2188     0.3441
##       0.459     0.3855      0.524      0.827     0.3269     0.1994     0.3224
##       0.448     0.3742      0.514      0.822     0.3156     0.1896     0.3112
##       0.437     0.3629      0.503      0.817     0.3045     0.1800     0.3001
##       0.425     0.3509      0.492      0.812     0.2927     0.1700     0.2883
##       0.413     0.3387      0.480      0.806     0.2808     0.1602     0.2765
##       0.401     0.3265      0.468      0.800     0.2690     0.1505     0.2647
##       0.389     0.3145      0.457      0.794     0.2575     0.1413     0.2532
##       0.377     0.3028      0.445      0.788     0.2463     0.1325     0.2420
##       0.365     0.2914      0.434      0.782     0.2354     0.1242     0.2312
##       0.353     0.2796      0.422      0.776     0.2243     0.1158     0.2202
##       0.341     0.2677      0.409      0.769     0.2131     0.1076     0.2091
##       0.328     0.2551      0.396      0.762     0.2014     0.0992     0.1974
##       0.314     0.2421      0.382      0.754     0.1894     0.0908     0.1855
##       0.299     0.2278      0.367      0.745     0.1764     0.0819     0.1726
##       0.284     0.2140      0.352      0.736     0.1639     0.0736     0.1602
##       0.269     0.2008      0.337      0.727     0.1521     0.0662     0.1486
##       0.255     0.1879      0.322      0.717     0.1407     0.0591     0.1374
##       0.242     0.1758      0.308      0.708     0.1301     0.0528     0.1269
##       0.228     0.1640      0.294      0.698     0.1199     0.0470     0.1168
##       0.213     0.1506      0.277      0.686     0.1086     0.0407     0.1056
##       0.197     0.1373      0.260      0.674     0.0973     0.0348     0.0946
##       0.182     0.1246      0.244      0.661     0.0869     0.0295     0.0843
##       0.166     0.1107      0.225      0.645     0.0757     0.0242     0.0733
##       0.139     0.0893      0.194      0.618     0.0588     0.0168     0.0568
##  survival97 survival98 survival99 survival100 survival101 survival102
##      0.9878      0.998     0.9846       0.992       0.998       0.994
##      0.9757      0.997     0.9694       0.983       0.995       0.988
##      0.9636      0.995     0.9542       0.975       0.993       0.982
##      0.9516      0.993     0.9392       0.967       0.991       0.977
##      0.9396      0.992     0.9243       0.959       0.988       0.971
##      0.9276      0.990     0.9095       0.950       0.986       0.965
##      0.9156      0.988     0.8946       0.942       0.984       0.959
##      0.9036      0.987     0.8798       0.934       0.981       0.953
##      0.8916      0.985     0.8651       0.925       0.979       0.947
##      0.8797      0.983     0.8505       0.917       0.976       0.941
##      0.8679      0.981     0.8361       0.909       0.974       0.935
##      0.8561      0.980     0.8218       0.900       0.972       0.929
##      0.8443      0.978     0.8076       0.892       0.969       0.923
##      0.8208      0.974     0.7793       0.875       0.964       0.910
##      0.8091      0.972     0.7653       0.866       0.961       0.904
##      0.7970      0.971     0.7509       0.858       0.959       0.898
##      0.7849      0.969     0.7365       0.849       0.956       0.891
##      0.7728      0.967     0.7222       0.840       0.953       0.885
##      0.7608      0.965     0.7081       0.831       0.950       0.878
##      0.7490      0.963     0.6942       0.822       0.948       0.871
##      0.7372      0.961     0.6804       0.813       0.945       0.865
##      0.7255      0.959     0.6668       0.805       0.942       0.858
##      0.7137      0.956     0.6532       0.796       0.939       0.852
##      0.7020      0.954     0.6397       0.787       0.936       0.845
##      0.6903      0.952     0.6263       0.778       0.933       0.838
##      0.6781      0.950     0.6124       0.769       0.930       0.831
##      0.6661      0.948     0.5987       0.760       0.927       0.824
##      0.6542      0.946     0.5852       0.750       0.924       0.817
##      0.6422      0.943     0.5717       0.741       0.921       0.810
##      0.6301      0.941     0.5581       0.731       0.918       0.803
##      0.6176      0.938     0.5441       0.722       0.914       0.795
##      0.6052      0.936     0.5304       0.712       0.911       0.787
##      0.5930      0.933     0.5169       0.702       0.907       0.780
##      0.5809      0.931     0.5037       0.692       0.904       0.772
##      0.5689      0.928     0.4905       0.683       0.901       0.765
##      0.5567      0.926     0.4772       0.673       0.897       0.757
##      0.5438      0.923     0.4634       0.662       0.893       0.748
##      0.5309      0.920     0.4495       0.651       0.889       0.740
##      0.5181      0.917     0.4359       0.641       0.885       0.731
##      0.5055      0.914     0.4225       0.630       0.881       0.723
##      0.4930      0.911     0.4094       0.620       0.877       0.714
##      0.4807      0.908     0.3965       0.609       0.873       0.706
##      0.4686      0.905     0.3840       0.599       0.869       0.697
##      0.4566      0.902     0.3716       0.588       0.864       0.689
##      0.4443      0.899     0.3590       0.577       0.860       0.680
##      0.4324      0.895     0.3469       0.567       0.856       0.671
##      0.4206      0.892     0.3350       0.556       0.851       0.662
##      0.4090      0.889     0.3233       0.546       0.847       0.653
##      0.3976      0.885     0.3120       0.536       0.843       0.645
##      0.3863      0.882     0.3009       0.525       0.838       0.636
##      0.3753      0.879     0.2901       0.515       0.834       0.627
##      0.3641      0.875     0.2792       0.505       0.829       0.618
##      0.3528      0.872     0.2683       0.494       0.824       0.609
##      0.3416      0.868     0.2576       0.483       0.819       0.600
##      0.3199      0.860     0.2371       0.462       0.809       0.581
##      0.3086      0.856     0.2266       0.451       0.804       0.571
##      0.2975      0.852     0.2164       0.440       0.798       0.562
##      0.2858      0.848     0.2057       0.428       0.792       0.551
##      0.2740      0.843     0.1950       0.416       0.786       0.540
##      0.2622      0.838     0.1844       0.404       0.780       0.529
##      0.2508      0.833     0.1744       0.392       0.773       0.518
##      0.2396      0.828     0.1646       0.380       0.767       0.507
##      0.2289      0.823     0.1553       0.368       0.760       0.496
##      0.2179      0.818     0.1460       0.356       0.754       0.484
##      0.2068      0.812     0.1367       0.344       0.746       0.472
##      0.1952      0.806     0.1271       0.331       0.738       0.460
##      0.1834      0.800     0.1174       0.317       0.730       0.446
##      0.1705      0.792     0.1071       0.302       0.720       0.431
##      0.1582      0.784     0.0975       0.287       0.710       0.416
##      0.1466      0.776     0.0885       0.273       0.700       0.401
##      0.1355      0.768     0.0801       0.258       0.690       0.386
##      0.1251      0.760     0.0724       0.245       0.680       0.372
##      0.1151      0.752     0.0652       0.231       0.669       0.357
##      0.1040      0.742     0.0574       0.216       0.657       0.340
##      0.0930      0.731     0.0499       0.200       0.643       0.323
##      0.0829      0.720     0.0431       0.185       0.630       0.306
##      0.0720      0.707     0.0360       0.168       0.613       0.286
##      0.0556      0.683     0.0260       0.141       0.585       0.253
##  survival103 survival104 survival105 survival106 survival107 survival108
##        0.997     0.97916       0.992       0.997       0.997       0.997
##        0.995     0.95859       0.984       0.994       0.993       0.994
##        0.992     0.93819       0.976       0.991       0.990       0.991
##        0.989     0.91819       0.968       0.988       0.986       0.988
##        0.986     0.89844       0.960       0.985       0.983       0.985
##        0.984     0.87886       0.952       0.982       0.980       0.982
##        0.981     0.85936       0.944       0.979       0.976       0.979
##        0.978     0.84009       0.936       0.976       0.973       0.976
##        0.975     0.82105       0.928       0.973       0.969       0.973
##        0.972     0.80223       0.920       0.969       0.965       0.970
##        0.969     0.78381       0.912       0.966       0.962       0.967
##        0.966     0.76561       0.904       0.963       0.958       0.964
##        0.963     0.74763       0.895       0.960       0.955       0.961
##        0.957     0.71221       0.879       0.953       0.947       0.954
##        0.954     0.69488       0.871       0.950       0.944       0.951
##        0.951     0.67707       0.862       0.946       0.940       0.948
##        0.948     0.65946       0.854       0.943       0.936       0.944
##        0.945     0.64213       0.845       0.939       0.932       0.941
##        0.941     0.62508       0.837       0.936       0.928       0.938
##        0.938     0.60845       0.828       0.932       0.924       0.934
##        0.935     0.59212       0.820       0.929       0.920       0.931
##        0.932     0.57603       0.811       0.925       0.916       0.927
##        0.928     0.56011       0.802       0.921       0.912       0.923
##        0.925     0.54438       0.794       0.918       0.908       0.920
##        0.921     0.52888       0.785       0.914       0.903       0.916
##        0.918     0.51297       0.776       0.910       0.899       0.912
##        0.914     0.49741       0.767       0.906       0.895       0.909
##        0.911     0.48225       0.758       0.902       0.890       0.905
##        0.907     0.46718       0.749       0.898       0.886       0.901
##        0.903     0.45213       0.740       0.894       0.881       0.897
##        0.899     0.43678       0.730       0.890       0.876       0.892
##        0.895     0.42187       0.721       0.885       0.872       0.888
##        0.891     0.40730       0.711       0.881       0.867       0.884
##        0.887     0.39317       0.702       0.877       0.862       0.880
##        0.883     0.37928       0.692       0.872       0.857       0.875
##        0.879     0.36536       0.682       0.868       0.852       0.871
##        0.874     0.35100       0.672       0.863       0.846       0.866
##        0.870     0.33677       0.661       0.858       0.841       0.861
##        0.865     0.32299       0.651       0.853       0.835       0.856
##        0.860     0.30954       0.641       0.847       0.830       0.851
##        0.855     0.29655       0.630       0.842       0.824       0.846
##        0.851     0.28394       0.620       0.837       0.818       0.841
##        0.846     0.27175       0.610       0.832       0.813       0.836
##        0.841     0.25987       0.599       0.827       0.807       0.831
##        0.836     0.24803       0.589       0.821       0.801       0.826
##        0.831     0.23669       0.579       0.816       0.795       0.820
##        0.826     0.22567       0.568       0.811       0.789       0.815
##        0.821     0.21509       0.558       0.805       0.783       0.810
##        0.816     0.20490       0.548       0.800       0.777       0.804
##        0.811     0.19503       0.538       0.794       0.771       0.799
##        0.805     0.18556       0.527       0.788       0.765       0.794
##        0.800     0.17613       0.517       0.783       0.758       0.788
##        0.795     0.16681       0.507       0.777       0.752       0.782
##        0.789     0.15781       0.496       0.771       0.745       0.776
##        0.778     0.14100       0.475       0.758       0.732       0.764
##        0.771     0.13258       0.464       0.752       0.725       0.758
##        0.765     0.12450       0.453       0.745       0.718       0.751
##        0.758     0.11618       0.442       0.738       0.710       0.744
##        0.751     0.10805       0.430       0.731       0.701       0.737
##        0.744     0.10018       0.417       0.723       0.693       0.729
##        0.737     0.09279       0.405       0.715       0.685       0.721
##        0.730     0.08583       0.394       0.707       0.676       0.714
##        0.722     0.07930       0.382       0.699       0.668       0.706
##        0.714     0.07288       0.370       0.691       0.659       0.698
##        0.706     0.06661       0.357       0.682       0.649       0.689
##        0.697     0.06033       0.344       0.673       0.639       0.680
##        0.688     0.05418       0.331       0.663       0.628       0.670
##        0.677     0.04782       0.315       0.651       0.616       0.659
##        0.666     0.04204       0.300       0.639       0.604       0.647
##        0.655     0.03690       0.286       0.628       0.591       0.636
##        0.643     0.03220       0.271       0.616       0.578       0.624
##        0.632     0.02807       0.257       0.604       0.566       0.612
##        0.620     0.02433       0.244       0.592       0.553       0.600
##        0.607     0.02044       0.228       0.578       0.538       0.586
##        0.592     0.01688       0.212       0.562       0.522       0.571
##        0.577     0.01383       0.197       0.547       0.506       0.556
##        0.559     0.01086       0.180       0.528       0.486       0.537
##        0.529     0.00698       0.152       0.496       0.453       0.506
##  survival109 survival110 survival111 survival112 survival113 survival114
##        0.999       0.999      0.9868       0.997       0.996       0.997
##        0.997       0.998      0.9736       0.994       0.991       0.995
##        0.996       0.996      0.9605       0.990       0.986       0.992
##        0.995       0.995      0.9475       0.987       0.982       0.989
##        0.994       0.994      0.9346       0.984       0.977       0.987
##        0.992       0.993      0.9216       0.981       0.973       0.984
##        0.991       0.991      0.9087       0.978       0.968       0.981
##        0.990       0.990      0.8957       0.974       0.963       0.978
##        0.988       0.989      0.8828       0.971       0.959       0.975
##        0.987       0.988      0.8700       0.967       0.954       0.973
##        0.985       0.986      0.8573       0.964       0.949       0.970
##        0.984       0.985      0.8447       0.961       0.944       0.967
##        0.983       0.984      0.8321       0.957       0.940       0.964
##        0.980       0.981      0.8070       0.950       0.930       0.958
##        0.978       0.980      0.7945       0.947       0.925       0.955
##        0.977       0.978      0.7816       0.943       0.920       0.952
##        0.975       0.977      0.7687       0.939       0.915       0.949
##        0.974       0.975      0.7558       0.936       0.910       0.946
##        0.972       0.974      0.7431       0.932       0.904       0.942
##        0.971       0.972      0.7305       0.928       0.899       0.939
##        0.969       0.971      0.7181       0.924       0.894       0.936
##        0.967       0.969      0.7057       0.921       0.889       0.933
##        0.966       0.968      0.6933       0.917       0.883       0.930
##        0.964       0.966      0.6809       0.913       0.878       0.926
##        0.962       0.964      0.6686       0.909       0.873       0.923
##        0.961       0.963      0.6558       0.905       0.867       0.919
##        0.959       0.961      0.6432       0.901       0.861       0.916
##        0.957       0.959      0.6307       0.896       0.856       0.912
##        0.955       0.958      0.6182       0.892       0.850       0.908
##        0.953       0.956      0.6055       0.888       0.844       0.905
##        0.951       0.954      0.5924       0.883       0.838       0.901
##        0.949       0.952      0.5796       0.879       0.831       0.897
##        0.947       0.950      0.5669       0.874       0.825       0.893
##        0.945       0.948      0.5543       0.869       0.819       0.889
##        0.943       0.946      0.5419       0.865       0.813       0.885
##        0.941       0.944      0.5292       0.860       0.806       0.881
##        0.939       0.942      0.5160       0.855       0.799       0.876
##        0.937       0.940      0.5027       0.849       0.792       0.872
##        0.934       0.938      0.4896       0.844       0.785       0.867
##        0.932       0.936      0.4766       0.839       0.778       0.863
##        0.930       0.933      0.4638       0.833       0.771       0.858
##        0.927       0.931      0.4513       0.828       0.764       0.853
##        0.925       0.929      0.4389       0.822       0.757       0.848
##        0.922       0.926      0.4267       0.817       0.750       0.844
##        0.920       0.924      0.4143       0.811       0.742       0.839
##        0.917       0.921      0.4022       0.806       0.735       0.834
##        0.914       0.919      0.3903       0.800       0.727       0.829
##        0.912       0.916      0.3786       0.794       0.720       0.824
##        0.909       0.914      0.3672       0.788       0.713       0.819
##        0.906       0.911      0.3559       0.783       0.705       0.814
##        0.904       0.909      0.3449       0.777       0.698       0.809
##        0.901       0.906      0.3337       0.771       0.690       0.803
##        0.898       0.903      0.3224       0.764       0.682       0.798
##        0.895       0.900      0.3113       0.758       0.674       0.792
##        0.889       0.895      0.2899       0.745       0.658       0.781
##        0.886       0.892      0.2789       0.738       0.649       0.775
##        0.882       0.888      0.2680       0.732       0.641       0.769
##        0.879       0.885      0.2565       0.724       0.631       0.762
##        0.875       0.881      0.2451       0.716       0.621       0.755
##        0.871       0.878      0.2336       0.708       0.611       0.748
##        0.867       0.874      0.2226       0.700       0.601       0.741
##        0.863       0.870      0.2119       0.692       0.592       0.734
##        0.859       0.866      0.2015       0.684       0.582       0.726
##        0.854       0.862      0.1911       0.675       0.571       0.719
##        0.850       0.857      0.1805       0.666       0.560       0.711
##        0.845       0.853      0.1696       0.656       0.549       0.702
##        0.839       0.847      0.1584       0.646       0.536       0.692
##        0.833       0.841      0.1464       0.634       0.522       0.682
##        0.827       0.835      0.1349       0.622       0.508       0.671
##        0.820       0.829      0.1243       0.610       0.494       0.660
##        0.813       0.823      0.1140       0.597       0.480       0.648
##        0.807       0.816      0.1045       0.585       0.466       0.637
##        0.800       0.810      0.0955       0.573       0.452       0.626
##        0.792       0.802      0.0856       0.558       0.435       0.612
##        0.782       0.793      0.0758       0.542       0.418       0.598
##        0.773       0.784      0.0669       0.526       0.400       0.583
##        0.762       0.773      0.0574       0.507       0.380       0.565
##        0.742       0.754      0.0434       0.475       0.346       0.535
##  survival115 survival116 survival117 survival118 survival119 survival120
##        0.995       0.992       0.998      0.9829       0.995       0.999
##        0.990       0.984       0.997      0.9659       0.990       0.997
##        0.985       0.976       0.995      0.9490       0.986       0.996
##        0.980       0.969       0.993      0.9324       0.981       0.994
##        0.975       0.961       0.992      0.9159       0.976       0.993
##        0.970       0.953       0.990      0.8994       0.971       0.992
##        0.965       0.945       0.988      0.8830       0.966       0.990
##        0.960       0.937       0.986      0.8667       0.961       0.989
##        0.955       0.929       0.985      0.8506       0.956       0.987
##        0.950       0.921       0.983      0.8346       0.951       0.986
##        0.944       0.913       0.981      0.8188       0.946       0.984
##        0.939       0.905       0.979      0.8032       0.941       0.983
##        0.934       0.897       0.977      0.7876       0.936       0.981
##        0.923       0.881       0.974      0.7569       0.926       0.978
##        0.918       0.873       0.972      0.7417       0.921       0.977
##        0.912       0.864       0.970      0.7261       0.916       0.975
##        0.907       0.856       0.968      0.7106       0.910       0.973
##        0.901       0.847       0.966      0.6952       0.905       0.972
##        0.895       0.839       0.964      0.6800       0.899       0.970
##        0.890       0.831       0.962      0.6651       0.894       0.968
##        0.884       0.822       0.960      0.6505       0.888       0.967
##        0.878       0.814       0.958      0.6359       0.883       0.965
##        0.873       0.805       0.956      0.6214       0.877       0.963
##        0.867       0.797       0.953      0.6071       0.872       0.961
##        0.861       0.788       0.951      0.5929       0.866       0.959
##        0.855       0.779       0.949      0.5782       0.860       0.958
##        0.849       0.770       0.947      0.5638       0.854       0.956
##        0.842       0.761       0.944      0.5496       0.848       0.954
##        0.836       0.752       0.942      0.5355       0.842       0.952
##        0.830       0.743       0.940      0.5213       0.836       0.950
##        0.823       0.734       0.937      0.5067       0.829       0.948
##        0.816       0.724       0.934      0.4925       0.823       0.945
##        0.810       0.715       0.932      0.4785       0.816       0.943
##        0.803       0.705       0.929      0.4648       0.810       0.941
##        0.796       0.696       0.927      0.4513       0.803       0.939
##        0.789       0.686       0.924      0.4377       0.796       0.937
##        0.782       0.676       0.921      0.4235       0.789       0.934
##        0.774       0.666       0.918      0.4093       0.782       0.932
##        0.767       0.655       0.915      0.3955       0.775       0.929
##        0.759       0.645       0.912      0.3820       0.767       0.927
##        0.751       0.635       0.909      0.3688       0.760       0.924
##        0.744       0.625       0.906      0.3558       0.752       0.921
##        0.736       0.614       0.903      0.3433       0.745       0.919
##        0.729       0.604       0.900      0.3309       0.737       0.916
##        0.721       0.594       0.896      0.3185       0.730       0.913
##        0.713       0.584       0.893      0.3065       0.722       0.911
##        0.705       0.573       0.890      0.2947       0.714       0.908
##        0.697       0.563       0.886      0.2833       0.707       0.905
##        0.689       0.553       0.883      0.2723       0.699       0.902
##        0.681       0.543       0.880      0.2614       0.691       0.899
##        0.673       0.533       0.876      0.2510       0.683       0.896
##        0.665       0.523       0.873      0.2405       0.675       0.893
##        0.656       0.512       0.869      0.2300       0.667       0.890
##        0.648       0.501       0.865      0.2197       0.659       0.887
##        0.631       0.481       0.857      0.2003       0.642       0.880
##        0.622       0.470       0.853      0.1905       0.633       0.877
##        0.613       0.459       0.849      0.1809       0.624       0.873
##        0.603       0.447       0.845      0.1709       0.615       0.869
##        0.593       0.435       0.840      0.1610       0.605       0.865
##        0.582       0.423       0.835      0.1513       0.594       0.861
##        0.572       0.411       0.830      0.1421       0.584       0.857
##        0.561       0.399       0.825      0.1333       0.574       0.853
##        0.551       0.388       0.820      0.1249       0.564       0.848
##        0.540       0.376       0.814      0.1166       0.553       0.844
##        0.529       0.363       0.808      0.1083       0.542       0.839
##        0.517       0.350       0.802      0.0998       0.530       0.833
##        0.504       0.336       0.795      0.0914       0.517       0.827
##        0.489       0.321       0.788      0.0825       0.503       0.821
##        0.475       0.306       0.780      0.0742       0.488       0.814
##        0.460       0.291       0.772      0.0667       0.474       0.807
##        0.446       0.277       0.764      0.0596       0.460       0.800
##        0.432       0.263       0.755      0.0533       0.446       0.793
##        0.417       0.249       0.747      0.0474       0.432       0.785
##        0.401       0.234       0.737      0.0411       0.415       0.777
##        0.383       0.217       0.726      0.0351       0.397       0.767
##        0.366       0.202       0.715      0.0298       0.380       0.757
##        0.345       0.184       0.701      0.0244       0.360       0.745
##        0.311       0.156       0.677      0.0170       0.325       0.724
##  survival121 survival122 survival123 survival124 survival125 survival126
##       0.9879      0.9881       0.998       0.997       0.993       0.998
##       0.9758      0.9761       0.996       0.995       0.985       0.996
##       0.9637      0.9642       0.994       0.992       0.977       0.995
##       0.9518      0.9525       0.992       0.990       0.970       0.993
##       0.9399      0.9407       0.990       0.987       0.963       0.991
##       0.9280      0.9290       0.988       0.984       0.955       0.989
##       0.9160      0.9171       0.986       0.981       0.947       0.987
##       0.9040      0.9053       0.984       0.979       0.940       0.986
##       0.8921      0.8936       0.982       0.976       0.932       0.984
##       0.8802      0.8818       0.980       0.973       0.924       0.982
##       0.8685      0.8702       0.978       0.970       0.917       0.980
##       0.8567      0.8586       0.976       0.968       0.909       0.978
##       0.8450      0.8470       0.974       0.965       0.901       0.976
##       0.8216      0.8239       0.969       0.959       0.886       0.972
##       0.8100      0.8124       0.967       0.956       0.878       0.970
##       0.7979      0.8004       0.965       0.953       0.870       0.968
##       0.7858      0.7885       0.963       0.950       0.862       0.966
##       0.7738      0.7766       0.960       0.947       0.854       0.964
##       0.7618      0.7648       0.958       0.944       0.846       0.962
##       0.7500      0.7531       0.956       0.941       0.838       0.959
##       0.7383      0.7415       0.953       0.937       0.829       0.957
##       0.7266      0.7299       0.951       0.934       0.821       0.955
##       0.7149      0.7183       0.948       0.931       0.813       0.953
##       0.7032      0.7068       0.946       0.928       0.805       0.951
##       0.6916      0.6952       0.943       0.924       0.797       0.948
##       0.6794      0.6832       0.941       0.921       0.788       0.946
##       0.6674      0.6713       0.938       0.917       0.779       0.943
##       0.6556      0.6595       0.935       0.914       0.771       0.941
##       0.6436      0.6477       0.933       0.910       0.762       0.939
##       0.6316      0.6357       0.930       0.907       0.753       0.936
##       0.6190      0.6233       0.927       0.903       0.744       0.933
##       0.6067      0.6110       0.924       0.899       0.735       0.931
##       0.5945      0.5989       0.921       0.895       0.726       0.928
##       0.5825      0.5870       0.918       0.891       0.717       0.925
##       0.5705      0.5750       0.915       0.887       0.708       0.922
##       0.5583      0.5629       0.912       0.883       0.698       0.920
##       0.5454      0.5502       0.909       0.879       0.688       0.916
##       0.5325      0.5373       0.905       0.874       0.678       0.913
##       0.5198      0.5247       0.902       0.870       0.668       0.910
##       0.5072      0.5121       0.898       0.865       0.658       0.907
##       0.4947      0.4997       0.895       0.861       0.648       0.904
##       0.4824      0.4875       0.891       0.856       0.638       0.900
##       0.4703      0.4754       0.888       0.852       0.628       0.897
##       0.4583      0.4634       0.884       0.847       0.618       0.894
##       0.4461      0.4512       0.880       0.842       0.608       0.890
##       0.4342      0.4394       0.876       0.837       0.598       0.887
##       0.4224      0.4276       0.873       0.832       0.588       0.883
##       0.4108      0.4160       0.869       0.827       0.578       0.880
##       0.3994      0.4046       0.865       0.822       0.568       0.876
##       0.3881      0.3934       0.861       0.817       0.558       0.873
##       0.3771      0.3824       0.857       0.812       0.548       0.869
##       0.3659      0.3712       0.853       0.807       0.538       0.865
##       0.3546      0.3598       0.849       0.802       0.528       0.861
##       0.3434      0.3486       0.845       0.796       0.517       0.857
##       0.3217      0.3269       0.836       0.785       0.497       0.849
##       0.3104      0.3156       0.831       0.779       0.486       0.845
##       0.2993      0.3045       0.826       0.773       0.475       0.841
##       0.2876      0.2927       0.821       0.767       0.464       0.836
##       0.2757      0.2808       0.816       0.760       0.452       0.831
##       0.2639      0.2690       0.810       0.753       0.440       0.826
##       0.2525      0.2575       0.804       0.746       0.428       0.820
##       0.2413      0.2462       0.799       0.739       0.416       0.815
##       0.2305      0.2354       0.793       0.732       0.405       0.810
##       0.2195      0.2243       0.787       0.724       0.393       0.804
##       0.2084      0.2131       0.780       0.716       0.380       0.798
##       0.1968      0.2014       0.773       0.707       0.367       0.791
##       0.1849      0.1894       0.766       0.698       0.353       0.784
##       0.1720      0.1763       0.757       0.687       0.338       0.776
##       0.1596      0.1639       0.748       0.677       0.323       0.768
##       0.1480      0.1521       0.739       0.666       0.308       0.760
##       0.1368      0.1407       0.730       0.655       0.293       0.751
##       0.1264      0.1301       0.721       0.644       0.279       0.743
##       0.1163      0.1199       0.712       0.632       0.265       0.734
##       0.1051      0.1086       0.700       0.619       0.250       0.723
##       0.0941      0.0973       0.688       0.605       0.233       0.712
##       0.0839      0.0869       0.676       0.590       0.217       0.700
##       0.0729      0.0757       0.661       0.573       0.199       0.686
##       0.0564      0.0588       0.635       0.542       0.170       0.661
##  survival127 survival128 survival129 survival130 survival131 survival132
##       0.9846       0.991       0.998       0.998       0.998      0.9858
##       0.9693       0.982       0.995       0.996       0.996      0.9718
##       0.9540       0.973       0.993       0.995       0.994      0.9577
##       0.9390       0.963       0.991       0.993       0.992      0.9439
##       0.9240       0.954       0.988       0.991       0.990      0.9301
##       0.9091       0.945       0.986       0.989       0.988      0.9163
##       0.8942       0.936       0.983       0.987       0.986      0.9025
##       0.8794       0.927       0.981       0.985       0.984      0.8887
##       0.8646       0.918       0.978       0.983       0.982      0.8751
##       0.8499       0.908       0.976       0.981       0.980      0.8614
##       0.8355       0.899       0.973       0.979       0.978      0.8480
##       0.8211       0.890       0.971       0.977       0.976      0.8346
##       0.8069       0.881       0.968       0.975       0.974      0.8213
##       0.7785       0.862       0.963       0.971       0.969      0.7947
##       0.7645       0.853       0.961       0.969       0.967      0.7816
##       0.7499       0.843       0.958       0.967       0.965      0.7680
##       0.7355       0.834       0.955       0.965       0.963      0.7544
##       0.7212       0.824       0.952       0.963       0.960      0.7409
##       0.7070       0.815       0.949       0.960       0.958      0.7276
##       0.6931       0.805       0.947       0.958       0.956      0.7144
##       0.6793       0.796       0.944       0.956       0.953      0.7014
##       0.6656       0.786       0.941       0.954       0.951      0.6884
##       0.6520       0.776       0.938       0.951       0.948      0.6755
##       0.6385       0.767       0.935       0.949       0.946      0.6626
##       0.6250       0.757       0.932       0.947       0.943      0.6497
##       0.6111       0.747       0.929       0.944       0.941      0.6364
##       0.5973       0.737       0.926       0.942       0.938      0.6233
##       0.5839       0.727       0.923       0.939       0.936      0.6104
##       0.5703       0.717       0.919       0.937       0.933      0.5974
##       0.5567       0.707       0.916       0.934       0.930      0.5843
##       0.5427       0.697       0.913       0.931       0.927      0.5708
##       0.5290       0.686       0.909       0.928       0.924      0.5575
##       0.5154       0.676       0.906       0.926       0.921      0.5444
##       0.5022       0.665       0.902       0.923       0.918      0.5316
##       0.4890       0.655       0.898       0.920       0.915      0.5188
##       0.4757       0.644       0.895       0.917       0.912      0.5058
##       0.4619       0.633       0.891       0.914       0.909      0.4923
##       0.4480       0.622       0.887       0.910       0.905      0.4787
##       0.4344       0.611       0.883       0.907       0.902      0.4653
##       0.4209       0.599       0.879       0.904       0.898      0.4521
##       0.4078       0.588       0.874       0.901       0.895      0.4392
##       0.3950       0.577       0.870       0.897       0.891      0.4265
##       0.3824       0.566       0.866       0.894       0.888      0.4140
##       0.3700       0.555       0.862       0.890       0.884      0.4016
##       0.3575       0.544       0.857       0.887       0.880      0.3892
##       0.3453       0.533       0.853       0.883       0.877      0.3770
##       0.3334       0.522       0.848       0.880       0.873      0.3650
##       0.3218       0.511       0.844       0.876       0.869      0.3534
##       0.3105       0.501       0.839       0.872       0.865      0.3420
##       0.2994       0.490       0.835       0.869       0.861      0.3307
##       0.2886       0.479       0.830       0.865       0.857      0.3198
##       0.2777       0.469       0.825       0.861       0.853      0.3087
##       0.2668       0.458       0.821       0.857       0.849      0.2975
##       0.2561       0.447       0.816       0.853       0.845      0.2865
##       0.2356       0.425       0.805       0.845       0.836      0.2655
##       0.2252       0.414       0.800       0.840       0.832      0.2547
##       0.2150       0.403       0.794       0.836       0.827      0.2441
##       0.2043       0.391       0.788       0.831       0.822      0.2329
##       0.1936       0.379       0.782       0.825       0.816      0.2217
##       0.1831       0.366       0.776       0.820       0.811      0.2107
##       0.1730       0.354       0.769       0.815       0.805      0.2000
##       0.1634       0.342       0.762       0.809       0.799      0.1897
##       0.1541       0.331       0.756       0.804       0.793      0.1798
##       0.1448       0.319       0.749       0.798       0.787      0.1698
##       0.1355       0.306       0.741       0.792       0.781      0.1598
##  [output truncated: predicted survival-probability matrix, one column per
##   validation patient (survival133 ... survival149), each curve starting
##   near 1 and decreasing monotonically across the predicted death times]

10.4 GBM Classification modeling

  • Let’s run and score our classification GBM model:
require(gbm)

# Build the right-hand side of the formula from every radiomics and clinical
# column, excluding identifiers and outcome variables
var <- paste(colnames(train_df_classificaiton)[!(names(train_df_classificaiton) %in%
                            c("PatientID", "SurvivalTime", "ReachedEvent", "Event"))],
             collapse = "+")

classification_formula <- formula(paste("ReachedEvent ~ ", var))

classification_formula
classification_formula
## ReachedEvent ~ Histology + Mstage + Nstage + SourceDataset + 
##     Tstage + age + shape_Compactness1 + shape_Compactness2 + 
##     shape_Maximum3DDiameter + shape_SphericalDisproportion + 
##     shape_Sphericity + shape_SurfaceArea + shape_SurfaceVolumeRatio + 
##     shape_VoxelVolume + firstorder_Energy + firstorder_Entropy + 
##     firstorder_Kurtosis + firstorder_Maximum + firstorder_Mean + 
##     firstorder_MeanAbsoluteDeviation + firstorder_Median + firstorder_Minimum + 
##     firstorder_Range + firstorder_RootMeanSquared + firstorder_Skewness + 
##     firstorder_StandardDeviation + firstorder_Uniformity + firstorder_Variance + 
##     glcm_Autocorrelation + glcm_ClusterProminence + glcm_ClusterShade + 
##     glcm_ClusterTendency + glcm_Contrast + glcm_Correlation + 
##     glcm_DifferenceEntropy + glcm_DifferenceAverage + glcm_JointEnergy + 
##     glcm_JointEntropy + glcm_Id + glcm_Idm + glcm_Imc1 + glcm_Imc2 + 
##     glcm_Idmn + glcm_Idn + glcm_InverseVariance + glcm_MaximumProbability + 
##     glcm_SumAverage + glcm_SumEntropy + glrlm_ShortRunEmphasis + 
##     glrlm_LongRunEmphasis + glrlm_GrayLevelNonUniformity + glrlm_RunLengthNonUniformity + 
##     glrlm_RunPercentage + glrlm_LowGrayLevelRunEmphasis + glrlm_HighGrayLevelRunEmphasis + 
##     glrlm_ShortRunLowGrayLevelEmphasis + glrlm_ShortRunHighGrayLevelEmphasis + 
##     glrlm_LongRunLowGrayLevelEmphasis + glrlm_LongRunHighGrayLevelEmphasis
set.seed(1234)
gbm_model <- gbm(classification_formula,
                 data = train_df_classificaiton,
                 distribution = "bernoulli",  # binary outcome: ReachedEvent
                 n.trees = 2000,              # upper bound; the optimum is picked by CV below
                 interaction.depth = 2,
                 shrinkage = 0.001,           # small learning rate, hence the large tree budget
                 bag.fraction = 0.8,          # stochastic boosting: subsample 80% per tree
                 keep.data = FALSE,
                 cv.folds = 5)

# Optimal number of trees according to the 5-fold cross-validation error
nTrees_opt_cv <- gbm.perf(gbm_model, method = "cv")

validate_predictions <- predict(gbm_model,
                                newdata = validate_df_classification[, feature_names],
                                type = "response", n.trees = nTrees_opt_cv)
print(gbm_model)
## gbm(formula = classification_formula, distribution = "bernoulli", 
##     data = train_df_classificaiton, n.trees = 2000, interaction.depth = 2, 
##     shrinkage = 0.001, bag.fraction = 0.8, cv.folds = 5, keep.data = FALSE)
## A gradient boosted model with bernoulli loss function.
## 2000 iterations were performed.
## The best cross-validation iteration was 1994.
## There were 59 predictors of which 55 had non-zero influence.
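The best cross-validation iteration (1994) sits essentially at the 2000-tree budget, which usually means the CV loss was still improving when training stopped. Below is a sketch of a refit with a larger budget, keeping the other hyper-parameters unchanged (the name gbm_model_5k is introduced here; note that gbm.more() cannot be used to continue this model because it was fitted with keep.data = FALSE):

gbm_model_5k <- gbm(classification_formula,
                    data = train_df_classificaiton,
                    distribution = "bernoulli",
                    n.trees = 5000,        # larger cap so CV can find an interior optimum
                    interaction.depth = 2,
                    shrinkage = 0.001,
                    bag.fraction = 0.8,
                    keep.data = FALSE,
                    cv.folds = 5)
gbm.perf(gbm_model_5k, method = "cv")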
summary(gbm_model)

##                                         rel.inf
## glrlm_GrayLevelNonUniformity        10.63462324
## glcm_Idn                            10.61853054
## glcm_JointEntropy                    7.34880093
## age                                  7.16466880
## glcm_Imc2                            7.09082785
## shape_Maximum3DDiameter              6.66863631
## glcm_Idmn                            4.94916942
## glrlm_LowGrayLevelRunEmphasis        4.61427258
## glrlm_LongRunLowGrayLevelEmphasis    3.66596813
## shape_SurfaceVolumeRatio             3.43734812
## glrlm_ShortRunLowGrayLevelEmphasis   3.26310024
## glcm_Imc1                            3.09482928
## firstorder_Mean                      2.60316846
## glcm_MaximumProbability              2.32348685
## firstorder_Kurtosis                  1.94839491
## glrlm_RunLengthNonUniformity         1.84132992
## glcm_JointEnergy                     1.67199203
## firstorder_Skewness                  1.59994874
## glcm_Id                              1.54478172
## Nstage                               1.30080560
## firstorder_Median                    1.27249826
## glrlm_ShortRunHighGrayLevelEmphasis  1.08582519
## glrlm_HighGrayLevelRunEmphasis       1.07074217
## firstorder_Maximum                   1.01513415
## shape_VoxelVolume                    0.97911330
## glcm_ClusterShade                    0.73995204
## firstorder_Range                     0.62164633
## shape_Compactness1                   0.47367222
## glcm_DifferenceEntropy               0.46761797
## firstorder_Energy                    0.42374941
## Histology                            0.41105799
## shape_SphericalDisproportion         0.41084501
## glrlm_LongRunEmphasis                0.40357621
## glcm_ClusterProminence               0.37689143
## glcm_InverseVariance                 0.29308109
## glcm_Autocorrelation                 0.28275583
## glcm_Correlation                     0.28113919
## shape_SurfaceArea                    0.25195121
## glcm_Idm                             0.20880008
## firstorder_RootMeanSquared           0.20251066
## glrlm_ShortRunEmphasis               0.18809456
## glcm_DifferenceAverage               0.18684395
## glcm_SumAverage                      0.17856441
## glcm_Contrast                        0.14104611
## firstorder_Uniformity                0.10881107
## Tstage                               0.10168454
## SourceDataset                        0.07619114
## glrlm_RunPercentage                  0.07289271
## glrlm_LongRunHighGrayLevelEmphasis   0.04992584
## firstorder_Minimum                   0.04921836
## firstorder_StandardDeviation         0.04621650
## firstorder_Entropy                   0.04590268
## glcm_SumEntropy                      0.03518750
## glcm_ClusterTendency                 0.03181835
## firstorder_MeanAbsoluteDeviation     0.03035885
## Mstage                               0.00000000
## shape_Compactness2                   0.00000000
## shape_Sphericity                     0.00000000
## firstorder_Variance                  0.00000000
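The full relative-influence table is hard to scan. summary.gbm also returns it as a data frame, so the top contributors can be pulled out directly (a sketch; the name ri is introduced here):

# Relative influence as a data frame, skipping the default bar plot
ri <- summary(gbm_model, n.trees = nTrees_opt_cv, plotit = FALSE)
head(ri, 10)  # summary() already orders by decreasing rel.inf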
require(pROC)
roc(response=validate_df_classification$ReachedEvent, predictor=validate_predictions)
## Setting levels: control = 0, case = 1
## Setting direction: controls < cases
## 
## Call:
## roc.default(response = validate_df_classification$ReachedEvent,     predictor = validate_predictions)
## 
## Data: validate_predictions in 86 controls (validate_df_classification$ReachedEvent 0) < 63 cases (validate_df_classification$ReachedEvent 1).
## Area under the curve: 0.7123
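The 0.39 cutoff used next appears to be hand-tuned. A data-driven alternative, sketched here with pROC::coords (the object roc_gbm is introduced for this sketch), is to take the threshold that maximises Youden's J:

roc_gbm <- roc(response = validate_df_classification$ReachedEvent,
               predictor = validate_predictions)

# Threshold maximising Youden's J = sensitivity + specificity - 1
coords(roc_gbm, "best", best.method = "youden",
       ret = c("threshold", "sensitivity", "specificity"))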
# Binarise the predicted probabilities at a cutoff of 0.39 (hand-tuned; see the
# Youden sketch above for a data-driven alternative)
validate_predictions_bin <- as.factor(ifelse(validate_predictions > 0.39, 1, 0))
validate_df_classification_event <- as.factor(validate_df_classification$Event)

caret::confusionMatrix(validate_predictions_bin, validate_df_classification_event)
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction  0  1
##          0 38 27
##          1 29 55
##                                           
##                Accuracy : 0.6242          
##                  95% CI : (0.5412, 0.7021)
##     No Information Rate : 0.5503          
##     P-Value [Acc > NIR] : 0.04122         
##                                           
##                   Kappa : 0.2385          
##                                           
##  Mcnemar's Test P-Value : 0.89369         
##                                           
##             Sensitivity : 0.5672          
##             Specificity : 0.6707          
##          Pos Pred Value : 0.5846          
##          Neg Pred Value : 0.6548          
##              Prevalence : 0.4497          
##          Detection Rate : 0.2550          
##    Detection Prevalence : 0.4362          
##       Balanced Accuracy : 0.6189          
##                                           
##        'Positive' Class : 0               
## 
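Note that caret::confusionMatrix defaulted to 0 as the positive class, so the Sensitivity above is really the recall of non-events. To report the metrics with respect to the event class, the positive argument can be set explicitly (a minimal sketch):

caret::confusionMatrix(validate_predictions_bin,
                       validate_df_classification_event,
                       positive = "1")  # Sensitivity/PPV computed for the event class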

10.4.1 GBM Classification Prediction

gbm_predictions_test <- predict(gbm_model, 
                                imputed_test[,feature_names],
                                n.trees = nTrees_opt_cv,
                                type = "response")
PatientID <- imputed_test$PatientID

pred <- data.frame(
  PatientID = PatientID,
  Event = gbm_predictions_test
)

submission_gbm <- output_test %>%
  select(PatientID, SurvivalTime) %>%
  left_join(pred, by = "PatientID")

fwrite(submission_gbm, "submission_gbm.csv")

#submission_bin_gbm <- submission_gbm %>%
#                     mutate(Event = ifelse(Event > 0.39, 1,0))
pred_bin_gbm <- as.factor(ifelse(pred$Event > 0.39, 1, 0))  # binarise the Event column, not the whole data frame
print("ratio of predicted Events in test dataset: "); table(pred_bin_gbm)
## [1] "ratio of predicted Events in test dataset: "
## pred_bin_gbm
##   0   1 
##  53 197
print("ratio of Events in train dataset: "); table(train$Event)
## [1] "ratio of Events in train dataset: "
## 
##   0   1 
## 138 162
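The two tables above have different totals (250 test patients vs 300 training patients), so proportions compare more directly than raw counts; a quick sketch:

prop.table(table(pred_bin_gbm))  # predicted event rate: 53/250 vs 197/250
prop.table(table(train$Event))   # training event rate: 138/300 vs 162/300

The predicted event rate (about 0.79) is far above the training rate (0.54), another hint that the 0.39 cutoff may be too aggressive for the test set.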

Now that both models predict, for the same period, the probability of reaching the event, we can average their outputs and see whether they complement each other. A straight 50/50 blend is used here, which may not be the best mix; a sketch after the output below searches over the blend weight.

10.5 Blend both models (Survival and Classification) together

# Average the GBM event probability with the event probability implied by the
# survival model (1 - survival at the chosen period), weighting them equally
roc(predictor = (validate_predictions + (1 - suvival_predictions$survival[,
                                                        which(suvival_predictions$unique.death.times==period_choice)]))/2,
          response = validate_df_classification$ReachedEvent)
## Setting levels: control = 0, case = 1
## Setting direction: controls < cases
## 
## Call:
## roc.default(response = validate_df_classification$ReachedEvent,     predictor = (validate_predictions + (1 - suvival_predictions$survival[,         which(suvival_predictions$unique.death.times == period_choice)]))/2)
## 
## Data: (validate_predictions + (1 - suvival_predictions$survival[, which(suvival_predictions$unique.death.times == period_choice)]))/2 in 86 controls (validate_df_classification$ReachedEvent 0) < 63 cases (validate_df_classification$ReachedEvent 1).
## Area under the curve: 0.7154
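The 50/50 mix gives an AUC of 0.7154, barely above the classification model alone (0.7123). The weight itself can be tuned on the validation set; a minimal sketch, reusing validate_predictions, suvival_predictions, and period_choice from the chunks above (p_class, p_surv, weights, and aucs are names introduced here):

# Classification probability and the event probability implied by the survival model
p_class <- validate_predictions
p_surv  <- 1 - suvival_predictions$survival[,
               which(suvival_predictions$unique.death.times == period_choice)]

# Scan blend weights w*p_class + (1-w)*p_surv and keep the best validation AUC
weights <- seq(0, 1, by = 0.05)
aucs <- sapply(weights, function(w)
  as.numeric(roc(response = validate_df_classification$ReachedEvent,
                 predictor = w * p_class + (1 - w) * p_surv)$auc))
data.frame(weight = weights[which.max(aucs)], auc = max(aucs))

With only 149 validation patients, AUC differences between nearby weights are noisy, so the selected weight should be treated as indicative rather than optimal.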