Human Generated Data

Title

Untitled (group of elderly women on couches)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19802

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99
Person 99
Person 98.7
Indoors 98.3
Interior Design 98.3
Person 98.1
Person 97.6
Person 96.2
Chair 95.9
Furniture 95.9
Room 93.6
Person 93.5
Person 92.1
Person 90.4
Person 85.1
People 83.6
Living Room 82.8
Stage 79.9
Person 79.5
Crowd 73.1
Apparel 64.8
Suit 64.8
Coat 64.8
Overcoat 64.8
Clothing 64.8
Person 64.4
Female 62.3
Photo 61.2
Photography 61.2
Girl 60.6
Sitting 60.4
Face 60
Audience 58.7
Reception Room 55.7
Reception 55.7
Waiting Room 55.7
Person 48.9

Imagga
created on 2022-03-05

classroom 77.7
room 73.3
teacher 30.7
person 28.3
people 27.9
man 26.9
table 26.8
male 25.5
group 24.2
businessman 23.8
adult 22.5
business 22.5
interior 21.2
chair 21
men 19.7
professional 19.5
office 18.7
women 18.2
sitting 17.2
meeting 17
student 16.8
indoors 16.7
executive 15.8
education 15.6
indoor 15.5
corporate 15.5
businesswoman 15.4
modern 15.4
team 15.2
desk 15.1
work 14.9
happy 14.4
worker 13.5
communication 13.4
educator 13.3
talking 13.3
together 13.1
smiling 13
lifestyle 13
teamwork 13
home 12.8
musical instrument 12.5
job 12.4
learning 12.2
couple 12.2
conference 11.7
class 11.6
hall 11.3
study 11.2
school 11.2
holding 10.7
outfit 10.7
businesspeople 10.4
portrait 10.3
friends 10.3
restaurant 10.2
wind instrument 10.1
glass 10.1
house 10
musician 9.7
friendship 9.4
manager 9.3
confident 9.1
suit 9
teaching 8.8
boy 8.7
furniture 8.6
design 8.4
coffee 8.3
inside 8.3
music 8.2
board 8.1
cheerful 8.1
child 8.1
success 8
handsome 8
computer 8
employee 8
smile 7.8
students 7.8
concert 7.8
brass 7.7
studying 7.7
drinking 7.7
workplace 7.6
togetherness 7.5
silhouette 7.4
presentation 7.4
floor 7.4
phone 7.4
window 7.3
successful 7.3
laptop 7.3
looking 7.2
kid 7.1
happiness 7

Google
created on 2022-03-05

Dress 86.7
Chair 84.6
Table 82.1
Art 80
Couch 79
Font 74.8
Event 72.6
Monochrome 70.9
Monochrome photography 67.4
Painting 66.9
Room 65.1
Illustration 63.7
Sharing 61.9
Classic 60
Sitting 59.6
History 54.3
Visual arts 54.3
Vintage clothing 50.9

Microsoft
created on 2022-03-05

table 95.7
person 92.7
text 90
furniture 65.4
chair 62.8
people 58.4
group 58.1

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99%
Calm 89.9%
Sad 6.4%
Happy 1%
Angry 0.7%
Confused 0.6%
Fear 0.5%
Disgusted 0.5%
Surprised 0.3%

AWS Rekognition

Age 40-48
Gender Male, 96.7%
Calm 78.8%
Happy 9.4%
Confused 3.6%
Sad 3.5%
Angry 2%
Disgusted 1.4%
Surprised 0.7%
Fear 0.5%

AWS Rekognition

Age 48-56
Gender Female, 52.8%
Calm 87.2%
Happy 6.4%
Sad 3.8%
Confused 0.8%
Angry 0.6%
Disgusted 0.6%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 34-42
Gender Female, 73.1%
Calm 76%
Confused 8.2%
Sad 5.8%
Happy 3.7%
Surprised 2%
Disgusted 1.8%
Angry 1.5%
Fear 1.1%

AWS Rekognition

Age 34-42
Gender Male, 62.8%
Calm 92.6%
Sad 3.3%
Happy 1.5%
Confused 1.4%
Angry 0.5%
Disgusted 0.3%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 43-51
Gender Male, 74.2%
Fear 28.4%
Happy 22.2%
Calm 20.5%
Sad 17.4%
Disgusted 4%
Confused 3.4%
Angry 2.5%
Surprised 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Chair 95.9%

Captions

Microsoft

a group of people playing instruments and performing on a stage 81.9%
a group of people standing in a room 81.8%
a group of people in a room 81.7%