Human Generated Data

Title

Untitled (man and children standing around large model train)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17822

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.5
Person 99.5
Person 99.5
Person 99.4
Person 99.3
Person 97.9
Person 96.5
Person 95.9
Person 95.8
Person 95
Person 89.9
Workshop 86.4
Building 85.2
Factory 73.2
Lab 72.1
Transportation 67
Train 67
Vehicle 67
Electronics 60.7
Screen 60.7
Person 56.6
Worker 56.6
Monitor 56.5
Display 56.5
LCD Screen 56.1

Imagga
created on 2022-02-26

room 36.9
classroom 33.6
man 32.2
people 30.1
male 29.8
table 27.3
person 27.1
musical instrument 26.4
women 24.5
sitting 24
percussion instrument 23.9
adult 23.8
marimba 23.2
smiling 23.1
work 22
indoors 22
happy 21.3
office 20.7
interior 19.4
businessman 19.4
group 19.3
home 19.1
teacher 18.9
business 18.8
hospital 18.2
meeting 16.9
professional 16.7
men 16.3
modern 16.1
teamwork 15.8
chair 15.5
worker 15.4
businesspeople 15.2
together 14.9
nurse 14.7
businesswoman 14.5
team 14.3
communication 13.4
house 13.4
talking 13.3
lifestyle 13
education 13
student 12.9
class 12.5
couple 12.2
smile 12.1
desk 11.7
colleagues 11.7
executive 11.4
corporate 11.2
two 11
happiness 11
working 10.6
computer 10.4
portrait 10.3
school 10.3
senior 10.3
mature 10.2
life 10
cheerful 9.8
clinic 9.7
exam 9.6
standing 9.6
adults 9.5
togetherness 9.4
friends 9.4
study 9.3
casual 9.3
indoor 9.1
confident 9.1
holding 9.1
brass 9
casual clothing 8.8
wind instrument 8.8
patient 8.8
learning 8.5
enjoyment 8.4
board 8.1
kitchen 8
job 8
living room 7.8
students 7.8
teaching 7.8
salon 7.8
glass 7.8
luxury 7.7
collar 7.7
drinking 7.7
workplace 7.6
friendship 7.5
phone 7.4
occupation 7.3
successful 7.3
laptop 7.3
color 7.2
stringed instrument 7.2
suit 7.2
looking 7.2
family 7.1
kid 7.1
to 7.1
decor 7.1
medical 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

indoor 96.2
person 87.9
table 84.9
man 68.5
clothing 63.4
gun 22.1
several 11.6

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 52%
Calm 98.8%
Sad 0.9%
Surprised 0.1%
Confused 0.1%
Fear 0%
Disgusted 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 31-41
Gender Male, 80.8%
Calm 94.7%
Sad 3.9%
Happy 0.4%
Confused 0.4%
Surprised 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 39-47
Gender Female, 82.8%
Calm 93.1%
Sad 5.7%
Happy 0.3%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Angry 0.2%
Surprised 0.1%

AWS Rekognition

Age 45-51
Gender Male, 98.6%
Calm 82.4%
Happy 11.5%
Sad 2%
Fear 1.1%
Confused 1%
Surprised 0.8%
Disgusted 0.7%
Angry 0.5%

AWS Rekognition

Age 37-45
Gender Male, 91.5%
Calm 37.8%
Happy 24.6%
Confused 15%
Sad 13.1%
Angry 3.4%
Surprised 3%
Disgusted 2%
Fear 1.1%

AWS Rekognition

Age 26-36
Gender Male, 65%
Sad 93.5%
Calm 5.8%
Happy 0.3%
Confused 0.2%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%
Surprised 0%

AWS Rekognition

Age 51-59
Gender Male, 57%
Calm 99.3%
Sad 0.4%
Surprised 0.1%
Fear 0.1%
Angry 0%
Disgusted 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 33-41
Gender Male, 99.7%
Calm 99.2%
Sad 0.7%
Confused 0.1%
Happy 0%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Male, 97.8%
Calm 79.7%
Sad 6.8%
Confused 6.3%
Happy 2.7%
Disgusted 1.8%
Surprised 1%
Angry 1%
Fear 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Train 67%

Captions

Microsoft

a group of people standing in a room 91.4%
a group of people in a room 91.3%
a group of people standing around a table 81.3%

Text analysis

Amazon

Land of Pueblos
Grandif
KODVK

Google

of
Pucblos
CE
Land of Pucblos Grand CE
Grand
Land