Human Generated Data

Title

Untitled (nine women sitting on ledge)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19791

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Person 99.8
Person 99.7
Person 99.7
Person 99.7
Person 99.5
Person 99.5
Person 99.4
Person 99.3
Clothing 98.5
Apparel 98.5
Suit 95.5
Overcoat 95.5
Coat 95.5
People 88.8
Shoe 85.4
Footwear 85.4
Tuxedo 78.6
Dress 76
Shorts 75.5
Furniture 74.4
Shoe 72.9
Indoors 71.6
Blazer 68.1
Jacket 68.1
Crowd 67.9
Female 66.6
Floor 65.4
Table 64
Photography 63.5
Photo 63.5
Chair 60.2
Face 59.4
Shoe 58.6
Room 57.6
Lamp 56
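
The labels above are the raw output of Amazon Rekognition's label detection. A minimal sketch of how such tags could be reproduced with boto3 follows; the image path, region, and confidence floor are assumptions, not details from the museum's pipeline.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

# Placeholder path; Rekognition also accepts images stored in S3.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the listing above bottoms out near 56
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```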

Clarifai
created on 2023-10-22

people 99.8
group 98.2
group together 97.4
man 97.1
woman 96.6
many 95.8
adult 92.9
actor 91.9
leader 91.7
music 91.3
several 87.1
administration 86.8
education 86.7
dancing 86.3
indoors 84.1
five 80.5
child 79.8
actress 79.6
wear 79.2
monochrome 79.1
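
A comparable sketch against Clarifai's v2 REST API, assuming its general image recognition model; the API key, model id, and image URL below are placeholders, and auth details vary by account.

```python
import requests

API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder
MODEL_ID = "general-image-recognition"  # assumed model id

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 value; scale to match the percentages listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```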

Imagga
created on 2022-03-05

people 34
group 31.4
business 31
man 30.2
businessman 30
women 29.3
male 29.1
men 28.3
person 25.6
corporate 24
adult 23.9
stage 21.8
office 19.4
meeting 18.8
team 18.8
professional 18
singer 17.9
job 17.7
work 17.3
silhouette 16.5
modern 16.1
musical instrument 16
wind instrument 15.9
teamwork 15.8
happy 15
musician 14.3
together 14
executive 13.8
platform 13.7
suit 13.6
life 13.2
performer 13.1
manager 13
success 12.9
crowd 12.5
interior 12.4
outfit 12.3
teacher 11.8
communication 11.8
chair 11.7
couple 11.3
human 11.2
portrait 11
businesswoman 10.9
room 10.9
smiling 10.8
worker 10.6
indoors 10.5
boss 10.5
urban 10.5
hall 10.1
employee 9.9
conference 9.8
black 9.7
diversity 9.6
boy 9.6
table 9.5
brass 9.5
ethnic 9.5
career 9.5
lifestyle 9.4
smile 9.3
city 9.1
building 9.1
entertainer 9.1
design 9
handsome 8.9
party 8.6
attractive 8.4
company 8.4
fashion 8.3
successful 8.2
indoor 8.2
girls 8.2
light 8
day 7.8
happiness 7.8
diverse 7.8
dance 7.8
education 7.8
sitting 7.7
two 7.6
walk 7.6
laptop 7.6
finance 7.6
businesspeople 7.6
elegance 7.6
friends 7.5
dancer 7.5
inside 7.4
clothing 7.1
harmonica 7
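
The Imagga tags could be fetched in the same spirit from its /v2/tags endpoint, which uses HTTP basic auth; the key, secret, and image URL are placeholders.

```python
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholders
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```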

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 98.5
clothing 97.4
standing 96.5
posing 94.5
man 93
group 92.9
text 90.5
musical instrument 80
people 72.6
footwear 68.3
woman 60.8
clothes 28.7
line 19.7
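
For the Microsoft tags, a minimal sketch against the Azure Computer Vision analyze endpoint (v3.2, Tags feature); the resource endpoint, key, and image URL are placeholders.

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

# Confidences come back on a 0-1 scale; scale to match the listing above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```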

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 97.5%
Sad 97.8%
Happy 1.1%
Fear 0.3%
Angry 0.2%
Confused 0.2%
Calm 0.2%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 51-59
Gender Male, 96.3%
Sad 86.4%
Calm 4.5%
Happy 2.7%
Confused 1.9%
Angry 1.8%
Disgusted 1.2%
Surprised 1%
Fear 0.5%

AWS Rekognition

Age 37-45
Gender Male, 54.8%
Happy 90.7%
Sad 3.1%
Surprised 2.5%
Confused 1.7%
Calm 1.1%
Disgusted 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 99.7%
Happy 49.6%
Calm 39.1%
Disgusted 3.7%
Sad 3.6%
Confused 1.8%
Surprised 0.9%
Fear 0.7%
Angry 0.5%

AWS Rekognition

Age 48-54
Gender Female, 86.1%
Disgusted 32.5%
Sad 30.8%
Surprised 14.9%
Fear 6%
Happy 5.1%
Calm 5%
Confused 4.2%
Angry 1.5%

AWS Rekognition

Age 49-57
Gender Male, 89.5%
Calm 77.8%
Happy 7.4%
Confused 7%
Sad 3.6%
Surprised 1.5%
Disgusted 1.3%
Angry 1.1%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Male, 97.5%
Calm 51%
Happy 45.4%
Sad 1.1%
Disgusted 0.8%
Confused 0.6%
Fear 0.4%
Surprised 0.4%
Angry 0.3%
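
The age ranges, gender calls, and ranked emotion percentages above match the shape of Rekognition's DetectFaces output. A minimal sketch, assuming boto3 and a local copy of the image; the full attribute set must be requested or the age, gender, and emotion fields are omitted.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; rank by confidence to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```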

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
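
The Google Vision ratings are per-face Likelihood enums rather than percentages. A minimal sketch with the google-cloud-vision client, assuming credentials are configured in the environment; the image path is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY or POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```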

Feature analysis

Amazon

Person 99.8%
Person 99.8%
Person 99.7%
Person 99.7%
Person 99.7%
Person 99.5%
Person 99.5%
Person 99.4%
Person 99.3%
Shoe 85.4%
Shoe 72.9%
Shoe 58.6%
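
The Person and Shoe entries here are per-instance detections rather than whole-image tags. In Rekognition's DetectLabels response, object labels carry an Instances list with a bounding box and confidence per occurrence; a minimal sketch, reusing the same placeholder path as above.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_labels(Image={"Bytes": f.read()})

# Only object-type labels (Person, Shoe, ...) populate Instances;
# scene labels such as Indoors come back with an empty list.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(f'{label["Name"]} {instance["Confidence"]:.1f}%')
```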