Human Generated Data

Title

Untitled (large group of people at outside event)

Date

1957

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19835

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.7
Apparel 99.7
Person 99.3
Human 99.3
Person 99.2
Person 99.2
Person 99.2
Person 99.2
Person 98.9
Person 98.4
Crowd 96
Dress 92.4
Person 91.7
Person 91.5
Person 90.3
Person 90.1
Female 89.5
Person 88.1
Face 84
Audience 83.7
Person 79.8
People 75.6
Woman 72.9
Parade 68.4
Girl 65.9
Person 65.2
Coat 64.9
Robe 64.6
Fashion 64.6
Person 61
Suit 60.9
Overcoat 60.9
Person 60.2
Tree 60.2
Plant 60.2
Costume 60
Festival 59.6
Gown 59.3
Photography 57.9
Photo 57.9
Person 43.2
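
The label/score pairs above are confidence values on a 0-100 scale returned by Amazon Rekognition's label-detection service. A minimal sketch of how such tags can be generated, assuming boto3 with configured AWS credentials and a local copy of the photograph saved as photo.jpg (both assumptions, not part of the original record):

import boto3

# Rekognition client; region and credentials come from the standard AWS config.
client = boto3.client("rekognition")

# Read the image as raw bytes (hypothetical local filename).
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels down to a low confidence floor, mirroring the 43-99 range listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')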

Clarifai
created on 2023-10-22

people 100
many 99.7
group 99.5
crowd 98.9
adult 97.5
woman 97.5
group together 97.4
man 97.2
administration 95.9
leader 93
music 92.7
dancing 88.9
recreation 88.8
spectator 87.1
audience 85.4
child 85
wear 77.8
war 77.5
musician 76.4
veil 74.8

Imagga
created on 2022-03-05

people 33.4
bride 27.8
person 26.2
wedding 24.8
man 23.6
dress 23.5
groom 22.7
couple 22.6
love 20.5
male 19.8
happy 19.4
group 19.3
men 18.9
happiness 18.8
married 18.2
dancer 17.9
adult 17.5
women 16.6
performer 16.5
marriage 16.1
together 15.8
crowd 15.3
art 14.7
outfit 14.6
silhouette 14.1
brass 13.2
bouquet 13.2
portrait 12.9
entertainer 12.4
walking 12.3
fashion 12.1
two 11.8
gown 11.7
bridal 11.7
business 11.5
kin 11.4
celebration 11.2
wind instrument 11
outdoors 10.7
life 10.6
party 10.3
girls 10
smile 10
attractive 9.8
dance 9.8
human 9.7
businessman 9.7
looking 9.6
flowers 9.6
smiling 9.4
church 9.2
joy 9.2
outdoor 9.2
black 9.1
park 9
summer 9
romantic 8.9
success 8.8
wed 8.8
engagement 8.7
wife 8.5
face 8.5
friends 8.4
old 8.4
suit 8.1
team 8.1
pedestrian 8
fan 8
musical instrument 8
husband 7.8
day 7.8
veil 7.8
ceremony 7.8
wall 7.7
youth 7.7
clothing 7.5
traditional 7.5
teamwork 7.4
action 7.4
cheerful 7.3
handsome 7.1
family 7.1
cool 7.1
travel 7

Google
created on 2022-03-05

Black-and-white 85
Style 83.9
Crowd 80.7
Adaptation 79.3
Jacket 76.1
Event 73.5
Monochrome photography 73
Art 72.8
Monochrome 72.7
Suit 70.9
Crew 70.2
Sunglasses 69.2
Vintage clothing 66.4
History 66.3
Hat 65.2
Team 64.5
Tree 61.9
Illustration 56
Font 55.7
Recreation 53.7

Microsoft
created on 2022-03-05

person 99.6
text 95.9
people 93.6
clothing 91.4
outdoor 90.4
group 84.6
woman 82.1
standing 80.7
dance 78.2
man 76.8
crowd 76.6
black and white 61.4
posing 60.5
clothes 15.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Calm 98.1%
Happy 0.6%
Confused 0.6%
Sad 0.3%
Disgusted 0.2%
Surprised 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 90%
Calm 100%
Happy 0%
Sad 0%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 97.5%
Confused 1.1%
Sad 0.6%
Disgusted 0.4%
Surprised 0.1%
Happy 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 96.3%
Happy 53%
Calm 41%
Surprised 2.3%
Sad 2.1%
Confused 0.7%
Disgusted 0.4%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 29-39
Gender Female, 63.3%
Calm 76.9%
Happy 12%
Fear 3.9%
Surprised 3.5%
Angry 1.1%
Disgusted 0.9%
Confused 0.9%
Sad 0.9%

AWS Rekognition

Age 47-53
Gender Male, 99.6%
Sad 66.3%
Confused 18.3%
Calm 8.3%
Happy 4.2%
Disgusted 1.1%
Angry 0.6%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 24-34
Gender Male, 79%
Surprised 36.1%
Confused 18.5%
Happy 18.4%
Sad 13%
Calm 6.6%
Disgusted 3%
Angry 2.6%
Fear 1.8%

AWS Rekognition

Age 2-8
Gender Male, 64.5%
Calm 94.1%
Fear 2.3%
Sad 1.8%
Happy 0.9%
Angry 0.4%
Disgusted 0.3%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 22-30
Gender Female, 79.4%
Calm 86%
Sad 5.6%
Fear 3%
Happy 2.6%
Angry 1.3%
Disgusted 0.7%
Surprised 0.5%
Confused 0.3%

AWS Rekognition

Age 6-16
Gender Male, 53.2%
Calm 90%
Surprised 4.6%
Fear 1.7%
Sad 1.6%
Happy 1.4%
Disgusted 0.4%
Confused 0.3%
Angry 0.2%

AWS Rekognition

Age 6-16
Gender Female, 77.6%
Calm 66.2%
Fear 30.6%
Disgusted 1.6%
Sad 0.8%
Happy 0.4%
Angry 0.2%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 22-30
Gender Male, 58.2%
Calm 30.6%
Fear 30%
Angry 28.8%
Sad 3.1%
Surprised 2.9%
Confused 2.2%
Disgusted 1.5%
Happy 0.8%

AWS Rekognition

Age 23-33
Gender Male, 96.5%
Calm 98.8%
Happy 0.6%
Sad 0.3%
Surprised 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 98.3%
Happy 81.6%
Calm 5.7%
Fear 3.2%
Surprised 2.9%
Sad 2.1%
Angry 1.5%
Confused 1.5%
Disgusted 1.5%

AWS Rekognition

Age 31-41
Gender Female, 90.6%
Sad 71.3%
Calm 12.8%
Surprised 7.5%
Happy 4.5%
Confused 1.4%
Angry 1.1%
Disgusted 1%
Fear 0.5%

AWS Rekognition

Age 7-17
Gender Male, 86.1%
Fear 44.1%
Calm 40.5%
Surprised 3.7%
Disgusted 3.4%
Sad 3%
Happy 2.5%
Angry 1.7%
Confused 1.2%

AWS Rekognition

Age 24-34
Gender Male, 78.3%
Calm 70.5%
Sad 15%
Fear 3.9%
Angry 3.8%
Confused 2.4%
Disgusted 2%
Happy 1.2%
Surprised 1.1%
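
Each AWS Rekognition block above corresponds to one face detected in the photograph, with an estimated age range, a gender estimate, and confidence scores over eight emotions. A minimal sketch of the call that produces this structure, reusing the boto3 client and image bytes from the earlier label-detection example (the local filename remains an assumption):

# Detect faces with the full attribute set (age range, gender, emotions, etc.).
faces = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unordered; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')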

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
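
The Google Vision entries report bucketed likelihoods (Very unlikely through Very likely) rather than numeric scores, one block per detected face. A minimal sketch using the google-cloud-vision Python client (version 2.x or later assumed, along with configured credentials and the same hypothetical photo.jpg):

from google.cloud import vision

gv_client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    content = f.read()

# Run face detection; each annotation carries likelihood enums for several attributes.
response = gv_client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood values: UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)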

Feature analysis

Amazon

Person
Person 99.3%
Person 99.2%
Person 99.2%
Person 99.2%
Person 99.2%
Person 98.9%
Person 98.4%
Person 91.7%
Person 91.5%
Person 90.3%
Person 90.1%
Person 88.1%
Person 79.8%
Person 65.2%
Person 61%
Person 60.2%
Person 43.2%

Text analysis

Amazon

131
M13--YT3X

Google