Human Generated Data

Title

Untitled (man donating blood, other men watching)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20253

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 97.7
Human 97.7
Person 96.7
Person 96.1
Person 95.4
Person 95.3
Person 94.1
Sitting 94.1
Person 93.8
Apparel 89.3
Clothing 89.3
Coat 89.3
Overcoat 89.3
Suit 89.3
Plywood 87.5
Wood 87.5
Tie 83.2
Accessories 83.2
Accessory 83.2
Furniture 76.1
Tie 75.5
Restaurant 74.5
Suit 73.3
Suit 73.2
Indoors 69.1
Tie 66.7
Cafeteria 66
Shirt 62.9
Table 60.8
Room 59.3
Flooring 59
Face 57.3
Photography 57.3
Photo 57.3
Portrait 57.3
Couch 55.2

Imagga
created on 2022-03-05

businessman 50.4
meeting 50
executive 50
business 45.6
office 45.5
man 44.4
room 40
male 39.8
person 39.7
team 38.6
professional 38.6
group 37.9
table 37.3
people 35.2
businesswoman 34.6
businesspeople 33.2
classroom 33.1
teacher 32.8
corporate 31.8
teamwork 31.6
communication 26.9
sitting 26.7
colleagues 26.3
work 25.9
adult 25.9
laptop 25.7
men 24.9
working 23.9
desk 22.7
student 21.7
conference 21.5
discussion 21.4
together 21.1
smiling 21
talking 20.9
happy 20.7
computer 20.1
workplace 20
educator 19.8
confident 19.1
suit 19
worker 18.8
job 18.6
manager 17.7
indoors 17.6
successful 17.4
presentation 16.8
discussing 16.7
women 15.8
partners 15.6
indoor 15.5
entrepreneur 15.2
conversation 14.6
planning 14.5
career 14.2
modern 14
company 14
coworkers 13.8
cooperation 13.5
collar 13.4
handsome 13.4
cheerful 13
collaboration 12.8
associates 12.8
couple 12.2
education 12.1
smile 12.1
success 12.1
coffee 12.1
looking 12
technology 11.9
enrollee 11.8
explaining 11.8
board 11.8
document 11.8
busy 11.6
diversity 11.5
staff 11.5
happiness 11
lifestyle 10.9
employee 10.8
diverse 10.8
businessmen 10.7
center 10.7
corporation 10.6
ethnic 10.5
mature 10.2
seminar 9.8
colleague 9.8
leader 9.6
partnership 9.6
boss 9.6
chair 9.5
contemporary 9.4
senior 9.4
casual 9.3
interaction 8.9
interior 8.9
paperwork 8.8
partner 8.7
mid adult 8.7
paper 8.6
formal 8.6
adults 8.5
showing 8.5
portrait 8.4
hall 8.4
color 8.4
archive 8.3
restaurant 8.3
occupation 8.3
life 8.2
idea 8
debate 7.9
boardroom 7.9
associate 7.9
employment 7.7
class 7.7
30s 7.7
two 7.6
hand 7.6
females 7.6
plan 7.6
horizontal 7.5
smart 7.5
friends 7.5
holding 7.4
training 7.4
phone 7.4
director 7.3
to 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.2
clothing 93.1
table 90.4
man 88.1
text 82.3
black and white 62.3

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Calm 82.1%
Sad 11.3%
Happy 2.4%
Confused 2.1%
Surprised 1.3%
Disgusted 0.4%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Female, 80.4%
Calm 99.9%
Sad 0%
Surprised 0%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 99.3%
Sad 88.2%
Confused 5.6%
Calm 3.2%
Happy 1%
Disgusted 0.7%
Surprised 0.6%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 48-56
Gender Male, 51.1%
Calm 66.4%
Happy 23.3%
Sad 8.2%
Confused 0.7%
Surprised 0.5%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 94.8%
Sad 3.9%
Confused 0.5%
Angry 0.2%
Happy 0.2%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Male, 90%
Calm 79.5%
Sad 17%
Angry 1.1%
Happy 0.7%
Confused 0.7%
Disgusted 0.5%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 43-51
Gender Male, 99.3%
Calm 98.5%
Surprised 0.8%
Happy 0.3%
Disgusted 0.1%
Confused 0.1%
Sad 0.1%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.7%
Suit 89.3%
Tie 83.2%

Captions

Microsoft

a group of people sitting at a table 94.4%
a group of people sitting around a table 94.2%
a group of people sitting on a table 90.1%