Human Generated Data

Title

Untitled (WWI veterans)

Date

1957

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19841

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Person 98.9
Person 98.9
Person 98.4
Person 97.5
Piano 96.2
Musical Instrument 96.2
Leisure Activities 96.2
Person 96.1
Clinic 85.8
Indoors 81.6
Interior Design 81.6
Furniture 80.1
Chair 80.1
Apparel 79.4
Clothing 79.4
Accessories 67.7
Accessory 67.7
Sunglasses 67.7
Room 63.9
Coat 62.8
People 61.2
Hospital 60.5
Person 56.1
Person 47.7
Person 47
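Label lists like the one above pair each tag with a confidence score, and downstream use typically keeps only tags above a threshold. A minimal sketch in Python (the labels and scores are copied from the Rekognition list above; the threshold value is an arbitrary illustrative choice, not part of the record):

```python
# Filter machine-generated labels by confidence threshold.
# Scores are a sample taken from the Amazon Rekognition list above.
labels = [
    ("Person", 99.4), ("Piano", 96.2), ("Clinic", 85.8),
    ("Chair", 80.1), ("Sunglasses", 67.7), ("Person", 47.0),
]

def confident_labels(labels, threshold=80.0):
    """Keep only labels whose confidence meets the threshold."""
    return [name for name, score in labels if score >= threshold]

print(confident_labels(labels))  # ['Person', 'Piano', 'Clinic', 'Chair']
```

Note that low-confidence duplicates (the trailing "Person 47") drop out at any reasonable threshold.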

Imagga
created on 2022-03-05

man 43
person 39.8
male 39.7
office 36
people 34.6
business 31.6
adult 29.8
businessman 29.1
professional 29
room 28.6
computer 25.7
patient 24.4
center 24.1
working 23
work 22.8
meeting 22.6
job 22.1
desk 21.9
group 21.8
corporate 21.5
surgeon 20.9
indoors 20.2
table 20.1
smiling 19.5
specialist 19.5
happy 19.4
businesswoman 19.1
men 18
teamwork 17.6
classroom 17.5
laptop 17.5
sitting 17.2
team 17
colleagues 16.5
education 16.5
lab coat 16.4
businesspeople 16.1
coat 15.7
worker 15.6
teacher 15.4
suit 15.4
executive 15.4
doctor 15
medical 14.1
manager 14
occupation 13.8
indoor 13.7
home 13.6
casual 13.6
couple 13.1
mature 13
portrait 12.9
technology 12.6
case 12.5
together 12.3
senior 12.2
hospital 12
modern 11.9
20s 11.9
horizontal 11.7
talking 11.4
student 11.4
clinic 11.3
looking 11.2
women 11.1
board 10.9
lifestyle 10.8
nurse 10.7
smile 10.7
interior 10.6
corporation 10.6
sick person 10.5
camera 10.2
confident 10
associates 9.8
handsome 9.8
conference 9.8
keyboard 9.4
company 9.3
coffee 9.3
successful 9.2
chair 9.1
monitor 9.1
health 9
coworkers 8.8
clothing 8.8
concentration 8.7
face 8.5
adults 8.5
two 8.5
treatment 8.3
school 8.1
science 8
to 8
medicine 7.9
building 7.9
collaboration 7.9
hall 7.9
40s 7.8
color 7.8
discussion 7.8
expertise 7.8
class 7.7
30s 7.7
attractive 7.7
shop 7.6
career 7.6
communication 7.6
engineer 7.5
holding 7.4
focus 7.4
alone 7.3
success 7.2
bright 7.2

Google
created on 2022-03-05

Coat 92.2
Black 89.6
Service 77.5
Cooking 75.4
Medical 72.9
Art 72.8
Machine 72.4
Motor vehicle 72.2
Event 70.5
Monochrome photography 67.9
Monochrome 66.7
Engineering 66.3
Factory 65.9
Vintage clothing 65.4
T-shirt 63.2
Science 63.2
History 63.2
Medical equipment 62.8
Job 61.4
Room 61.3

Microsoft
created on 2022-03-05

text 97.7
person 97.6
clothing 89.1
black and white 82.2
man 79.6

Face analysis

AWS Rekognition

Age 53-61
Gender Male, 100%
Calm 99.8%
Confused 0.1%
Happy 0%
Disgusted 0%
Surprised 0%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Sad 98.2%
Calm 0.9%
Confused 0.7%
Angry 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 49-57
Gender Male, 99.8%
Calm 94.5%
Happy 1.9%
Confused 1.4%
Surprised 0.8%
Sad 0.7%
Disgusted 0.5%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 27-37
Gender Male, 99.1%
Happy 55.7%
Calm 38.5%
Sad 3.3%
Surprised 0.6%
Fear 0.6%
Angry 0.5%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 50-58
Gender Male, 99.9%
Calm 91.7%
Happy 5%
Disgusted 1%
Sad 0.8%
Angry 0.5%
Confused 0.4%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 48-54
Gender Male, 100%
Calm 67.8%
Confused 29.5%
Sad 1.3%
Disgusted 0.8%
Angry 0.2%
Happy 0.2%
Surprised 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
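Each AWS Rekognition face record above is a confidence distribution over eight emotions, so the reported mood of a face is simply the highest-scoring entry. A minimal sketch (scores copied from the second Rekognition face record above; the function name is illustrative):

```python
# Pick the dominant emotion from a Rekognition-style score dict.
# Scores copied from the second AWS Rekognition face record above.
face = {
    "Sad": 98.2, "Calm": 0.9, "Confused": 0.7, "Angry": 0.1,
    "Surprised": 0.1, "Disgusted": 0.0, "Happy": 0.0, "Fear": 0.0,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # ('Sad', 98.2)
```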

Feature analysis

Amazon

Person 99.4%
Piano 96.2%
Chair 80.1%
Sunglasses 67.7%

Captions

Microsoft

a group of people standing in front of a store 80.8%
a group of people standing in front of a building 80.7%
a group of people in front of a store 77.3%

Text analysis

Amazon

WAR
ONE
WORLD WAR ONE
JOIN
WORLD
VETERANS
JOIN VETERANS OF
6
GROUP
OF
I
STETSAN
I P
the
MJ17--YT37A°- -X
of
P

Google

GROUP
GROUP