Human Generated Data

Title

Untitled (group of women, raising hands)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19786

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.4
Person 99.4
Person 99.3
Chair 99
Furniture 99
Person 98.7
Person 98.1
Person 97.5
Person 95.7
Person 95.4
Person 94.8
Room 94.8
Indoors 94.8
Interior Design 94.4
Clothing 94
Apparel 94
Person 93.4
Person 92.3
Person 87.2
Crowd 85.1
Person 82.7
Person 79.9
People 78.9
Classroom 76.5
School 76.5
Audience 71.8
Female 71.4
Shoe 70.3
Footwear 70.3
Girl 63
Musician 62.9
Musical Instrument 62.9
Face 61.5
Suit 59.5
Coat 59.5
Overcoat 59.5
Shorts 57.4
Shoe 56.6
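
The label/confidence pairs above match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch with boto3 is below; the file name, region, and thresholds are placeholders, not values recorded with this image.

import boto3

def list_labels(path, max_labels=50, min_confidence=55.0):
    # DetectLabels returns labels, each with a Name and a 0-100
    # Confidence score, matching "Person 99.6", "Chair 99", ... above.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open(path, "rb") as image:
        response = client.detect_labels(
            Image={"Bytes": image.read()},
            MaxLabels=max_labels,
            MinConfidence=min_confidence,
        )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

list_labels("photo.jpg")  # placeholder path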

Clarifai
created on 2023-10-22

people 99.9
group 99.8
group together 99.2
woman 97.9
man 97.4
education 97.3
many 96.1
adult 95.7
musician 94.5
music 93.8
school 92.7
child 92.7
squad 91.1
teacher 90.7
actor 90.1
dancing 89.9
leader 89
dancer 86.6
rehearsal 86.4
singer 84.7
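
Concept/score lists like the Clarifai tags above can be produced with Clarifai's prediction REST endpoint. The sketch below uses plain HTTP; the model name, key placeholder, and response shape are assumptions based on Clarifai's public v2 API, not details taken from this record.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_KEY"        # placeholder credential
MODEL = "general-image-recognition"  # assumed public model name

def clarifai_tags(path):
    with open(path, "rb") as image:
        payload = {"inputs": [{"data": {"image": {
            "base64": base64.b64encode(image.read()).decode()}}}]}
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json=payload,
    )
    # Concepts carry a name and a 0-1 value; multiplying by 100
    # gives scores like "people 99.9", "group 99.8" above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')

clarifai_tags("photo.jpg")  # placeholder path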

Imagga
created on 2022-03-05

brass 100
wind instrument 100
musical instrument 71.7
cornet 41.2
people 39.6
person 28.1
man 26.2
group 25.8
male 23.4
silhouette 23.2
men 21.4
adult 20.1
trombone 18.8
sport 18.1
baritone 17.6
crowd 16.3
businessman 15.9
women 15.8
lifestyle 15.2
happy 15
business 14.6
black 14.4
human 14.2
team 12.5
portrait 12.3
music 12.2
player 12.2
body 12
fun 12
professional 11.9
active 11.7
couple 11.3
success 11.3
training 11.1
competition 11
exercise 10.9
boy 10.4
athlete 10.4
teenager 10
leisure 10
attractive 9.8
audience 9.7
style 9.6
party 9.4
friendship 9.4
dark 9.2
fitness 9
design 9
posing 8.9
sexy 8.8
together 8.8
concert 8.7
match 8.7
love 8.7
happiness 8.6
dance 8.5
model 8.5
modern 8.4
field 8.4
event 8.3
fashion 8.3
sky 8.3
teen 8.3
looking 8
play 7.7
youth 7.7
outdoor 7.6
friends 7.5
lights 7.4
action 7.4
symbol 7.4
girls 7.3
student 7.2
copy space 7.2
sunset 7.2
recreation 7.2
activity 7.2
shadow 7.2
bright 7.1
smile 7.1
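
The Imagga tags above follow the shape of Imagga's /v2/tags REST endpoint, which authenticates with an API key/secret pair. The sketch below is an assumption-level illustration of that endpoint; the credentials, upload field, and response fields are placeholders or assumptions, not values from this record.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"

def imagga_tags(path):
    with open(path, "rb") as image:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": image},  # assumed upload field name
        )
    # Each entry pairs a 0-100 confidence with an English tag name,
    # like "brass 100", "wind instrument 100", ... above.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')

imagga_tags("photo.jpg")  # placeholder path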

Google
created on 2022-03-05

Chair 85.8
Style 84
Black-and-white 83
Font 80.3
Event 74.4
Monochrome photography 71.8
Monochrome 71.1
Team 68.5
Crew 66.8
Room 65.9
Art 61.4
Music 61.3
Suit 61.2
Sitting 59.4
Fun 58.3
Crowd 57.8
T-shirt 57.1
Entertainment 50.9
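
The Google tags above resemble Cloud Vision label detection output, where each label has a description and a 0-1 score (shown here as a percentage). A minimal sketch with the google-cloud-vision client library; the file path is a placeholder and credentials are assumed to come from the environment.

from google.cloud import vision

def google_labels(path):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # label_annotations carry a description and a 0-1 score;
    # multiplying by 100 matches "Chair 85.8", "Style 84", ... above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")

google_labels("photo.jpg")  # placeholder path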

Microsoft
created on 2022-03-05

person 99.9
text 93.6
clothing 93.2
posing 91.5
standing 84
woman 80.5
group 75.6
man 62.2
female 30.2
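
Tags such as the Microsoft entries above can be produced with Azure's Computer Vision service. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file path are placeholders, and the exact tagging operation used for this record is an assumption.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

def azure_tags(path):
    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    with open(path, "rb") as image:
        result = client.tag_image_in_stream(image)
    # Tags carry a name and a 0-1 confidence; multiplying by 100
    # matches "person 99.9", "clothing 93.2", ... above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")

azure_tags("photo.jpg")  # placeholder path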

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.4%
Sad 76.5%
Confused 12%
Happy 4%
Surprised 3.6%
Disgusted 1.4%
Calm 1.4%
Angry 0.7%
Fear 0.3%

AWS Rekognition

Age 40-48
Gender Male, 99.6%
Sad 96.4%
Happy 1.9%
Calm 0.8%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 71.1%
Happy 64.5%
Surprised 24.5%
Calm 5.8%
Sad 2.9%
Fear 0.7%
Confused 0.6%
Angry 0.5%
Disgusted 0.5%

AWS Rekognition

Age 48-54
Gender Female, 64%
Calm 95.7%
Sad 2.9%
Disgusted 0.5%
Confused 0.2%
Surprised 0.2%
Angry 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Male, 69.2%
Happy 94.5%
Calm 2.8%
Sad 1.4%
Confused 0.4%
Surprised 0.4%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Male, 94.1%
Sad 69.3%
Calm 28%
Angry 0.7%
Happy 0.7%
Confused 0.6%
Fear 0.3%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 39-47
Gender Male, 94.6%
Happy 53.7%
Calm 31.3%
Disgusted 5.2%
Fear 3%
Surprised 2.7%
Sad 2.1%
Confused 1.2%
Angry 0.7%

AWS Rekognition

Age 45-53
Gender Female, 52.7%
Calm 97.6%
Happy 1.9%
Sad 0.3%
Disgusted 0.1%
Confused 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Female, 99.3%
Calm 99.6%
Happy 0.3%
Sad 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 77.6%
Calm 48.4%
Happy 48%
Sad 1%
Disgusted 0.8%
Confused 0.7%
Surprised 0.6%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Female, 96.2%
Calm 98.8%
Happy 0.5%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%
Sad 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 53-61
Gender Female, 99.5%
Sad 58.8%
Calm 38%
Happy 1.5%
Confused 0.9%
Surprised 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Male, 95.6%
Calm 53.8%
Happy 16.6%
Sad 9.3%
Disgusted 7.3%
Confused 7.3%
Angry 2.8%
Fear 1.7%
Surprised 1.3%
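
Each AWS Rekognition block above (estimated age range, gender, and a ranked emotion distribution) follows the per-face structure returned by Rekognition's DetectFaces operation when all attributes are requested. A minimal boto3 sketch; the file path and region are placeholders.

import boto3

def face_attributes(path):
    client = boto3.client("rekognition", region_name="us-east-1")
    with open(path, "rb") as image:
        response = client.detect_faces(
            Image={"Bytes": image.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Sorting emotions by confidence reproduces the ranked lists above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')

face_attributes("photo.jpg")  # placeholder path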

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
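
The Google Vision rows above report per-face likelihood buckets (Very unlikely through Very likely) rather than numeric scores, which is how the Cloud Vision face detection response expresses Joy, Sorrow, Anger, Surprise, Headwear, and Blurred. A minimal sketch; the file path is a placeholder.

from google.cloud import vision

LIKELIHOOD = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

def google_faces(path):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood fields are enums indexed 0-5, mapping to the
        # "Very unlikely" ... "Very likely" labels shown above.
        print("Surprise", LIKELIHOOD[face.surprise_likelihood])
        print("Anger", LIKELIHOOD[face.anger_likelihood])
        print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
        print("Joy", LIKELIHOOD[face.joy_likelihood])
        print("Headwear", LIKELIHOOD[face.headwear_likelihood])
        print("Blurred", LIKELIHOOD[face.blurred_likelihood])

google_faces("photo.jpg")  # placeholder path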

Feature analysis

Amazon

Person 99.6%
Person 99.4%
Person 99.4%
Person 99.3%
Person 98.7%
Person 98.1%
Person 97.5%
Person 95.7%
Person 95.4%
Person 94.8%
Person 93.4%
Person 92.3%
Person 87.2%
Person 82.7%
Person 79.9%
Chair 99%
Shoe 70.3%
Shoe 56.6%