Human Generated Data

Title

Untitled (bride signing document, people watching)

Date

1950

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19487

Machine Generated Data

Tags (each tag is listed with its confidence score, in percent)

Amazon
created on 2019-10-29

Apparel 99.8
Clothing 99.8
Human 99.1
Person 99.1
Person 98.8
Person 98.7
Person 98.5
Person 98.1
Person 97
Person 91
Suit 89.7
Coat 89.7
Overcoat 89.7
Person 86.6
Footwear 75.2
Shoe 75.2
Indoors 75.1
Female 69.9
People 67.4
Room 65.5
Photography 64.2
Photo 64.2
Face 63.4
Cream 62.6
Food 62.6
Cake 62.6
Creme 62.6
Dessert 62.6
Icing 62.6
Dress 62.1
Fashion 60.7
Gown 60.7
Tuxedo 60.6
Furniture 60.5
Robe 58.6
Evening Dress 58.4
Clinic 57.7
Wedding 56.8
Person 55.6
Woman 55.5
Table 55.2
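
The Amazon tags above are label names paired with percentage confidence scores from an automated tagging service (Amazon Rekognition). A minimal sketch of how comparable labels can be retrieved, assuming locally configured AWS credentials; the file name is illustrative, not part of this record:

```python
import boto3

# Assumptions: AWS credentials are configured locally; the file name is illustrative.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# the same shape as the tag list above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MaxLabels=50)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```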

Clarifai
created on 2019-10-29

people 99.9
adult 98.6
group 98.6
group together 98.2
man 95.9
woman 95.3
medical practitioner 95.3
wear 93.1
veil 90.6
leader 89
several 88
uniform 85.3
administration 84.6
outfit 83.4
three 83.3
many 80.8
four 77.7
child 77
monochrome 76.4
five 76.3

Imagga
created on 2019-10-29

medical 33.6
man 32.9
people 32.9
nurse 32.8
person 30.3
doctor 30.1
patient 29.8
hospital 29.7
professional 28
male 26.2
adult 24
health 23.6
work 22
coat 21.1
surgeon 19.9
medicine 19.4
men 18.9
clinic 18.6
lab coat 18.2
happy 17.5
women 17.4
worker 16.6
barbershop 16.6
clothing 15.7
laboratory 15.4
shop 15.1
job 15
indoors 14.9
couple 14.8
smiling 14.5
business 14
office 14
scientist 13.7
lab 13.6
team 13.4
research 13.3
working 13.3
room 12.9
occupation 12.8
biology 12.3
portrait 12.3
care 11.5
exam 11.5
businessman 11.5
adults 11.4
sitting 11.2
student 11
two 11
smile 10.7
science 10.7
chemistry 10.6
uniform 10.6
test 10.6
color 10.6
life 10.5
mercantile establishment 10.5
instrument 10.4
day 10.2
lifestyle 10.1
groom 10
holding 9.9
modern 9.8
surgery 9.8
assistant 9.7
30s 9.6
profession 9.6
illness 9.5
sick person 9.4
case 9.4
happiness 9.4
senior 9.4
casual 9.3
teamwork 9.3
indoor 9.1
human 9
surgical 8.9
group 8.9
together 8.8
standing 8.7
education 8.7
mask 8.6
corporate 8.6
face 8.5
clothes 8.4
treatment 8.3
garment 8.2
looking 8
to 8
sword 7.9
microscope 7.9
doctors 7.9
50s 7.8
experiment 7.8
chemical 7.8
scientific 7.8
busy 7.7
angle 7.7
serious 7.6
desk 7.6
specialist 7.5
study 7.5
cheerful 7.3
bride 7.3
healing 7.3
new 7.3
dress 7.2
home 7.2
bright 7.1
place of business 7.1

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

person 99.8
clothing 93.5
black and white 77.6
woman 75.6
text 75.2
wedding dress 68.3
dress 67.3
sport 66.9
man 52
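
The Microsoft tags follow the same pattern of tag name plus confidence. A hedged sketch of requesting comparable tags from the Azure Computer Vision Tag REST endpoint; the endpoint, key, and image URL are placeholders, and the service reports confidence on a 0-1 scale rather than the percentages shown in this record:

```python
import requests

# Placeholders: substitute a real Computer Vision endpoint, key, and image URL.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<subscription-key>"
image_url = "https://example.com/photo.jpg"

resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": image_url},
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    # Convert the 0-1 confidence to the percentage form used above.
    print(tag["name"], round(tag["confidence"] * 100, 1))
```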

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 54.5%
Angry 45.1%
Happy 45%
Calm 53.2%
Fear 45%
Surprised 45%
Sad 46.5%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 50-68
Gender Male, 54.9%
Calm 54.1%
Surprised 45.7%
Confused 45%
Angry 45%
Fear 45%
Disgusted 45%
Happy 45.1%
Sad 45%

AWS Rekognition

Age 31-47
Gender Male, 51.4%
Angry 45.8%
Sad 51.3%
Disgusted 45%
Fear 45%
Calm 47.6%
Happy 45%
Confused 45.1%
Surprised 45.1%

AWS Rekognition

Age 36-54
Gender Female, 50.7%
Sad 54.5%
Fear 45.1%
Angry 45%
Disgusted 45%
Surprised 45%
Calm 45.2%
Confused 45.1%
Happy 45%

AWS Rekognition

Age 36-52
Gender Male, 51.7%
Angry 45.4%
Fear 45.1%
Sad 49.1%
Surprised 45.2%
Happy 45%
Calm 50.1%
Disgusted 45%
Confused 45.1%
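
The face analysis entries above (estimated age range, gender, and per-emotion confidences) are the shape of output Amazon Rekognition's face detection returns. A minimal sketch, again assuming local AWS credentials and an illustrative file name:

```python
import boto3

# Assumption: the file name is illustrative, not part of this record.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates
# for each detected face, matching the fields listed above.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```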

Feature analysis

Amazon

Person 99.1%
Shoe 75.2%

Categories

Text analysis

Amazon

12.0061219