Human Generated Data

Title

Untitled (Halloween party, children in costumes dancing)

Date

1959

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18888

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.8
Apparel 99.8
Person 99.2
Human 99.2
Person 99.1
Person 98.9
Person 98.6
Person 96.3
Person 95.6
Person 95.6
Dress 95.5
Person 94.8
Person 93.4
Female 91.6
Person 90.7
Person 84.5
Gown 84.4
Fashion 84.4
Robe 83
Person 81.7
Chair 80.9
Furniture 80.9
People 79
Woman 78.2
Wedding 77.5
Shoe 72.7
Footwear 72.7
Shoe 69.3
Wedding Gown 68.3
Girl 65
Person 64.4
Portrait 64.2
Photography 64.2
Face 64.2
Photo 64.2
Crowd 63
Evening Dress 61.5
Leisure Activities 61.1
Suit 60.7
Coat 60.7
Overcoat 60.7
Shorts 58.1
Bridegroom 55.5
Bride 55.3
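
The label/confidence pairs above follow the shape of an Amazon Rekognition label-detection response (confidences on a 0-100 scale). A minimal boto3 sketch of how such tags could be produced follows; the region, S3 bucket, and object key are placeholders, not a record of the museum's actual pipeline.

    import boto3

    # Label detection: returns label names with 0-100 confidence scores,
    # matching the "Clothing 99.8", "Person 99.2", ... entries above.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photos/4.2002.18888.jpg"}},
        MinConfidence=55,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")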

Clarifai
created on 2023-10-22

people 100
many 99.3
group 99.3
child 98.6
dancing 97.9
group together 97.5
adult 97.5
woman 95.4
crowd 94.9
man 93.9
street 92.9
wear 92.6
several 91.1
monochrome 90.8
dancer 89.9
spectator 89.7
outfit 88.8
music 88.8
recreation 88.2
boy 84.9
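
The concept tags above resemble the output of Clarifai's v2 predict endpoint for its general image-recognition model. A hedged REST sketch follows; the endpoint path, model id, and key-based auth header are assumptions drawn from Clarifai's public documentation, and the credential and image URL are placeholders.

    import requests

    CLARIFAI_KEY = "YOUR_CLARIFAI_KEY"        # placeholder credential
    MODEL_ID = "general-image-recognition"    # assumed public general model id

    # Predict call: each returned concept has a name and a 0-1 "value",
    # scaled here to match the 0-100 figures above.
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")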

Imagga
created on 2022-03-05

people 27.9
man 24.8
person 22.1
male 19.2
adult 18.7
men 16.3
world 15.9
women 15.8
couple 13.9
dress 12.6
room 12.5
happy 11.9
happiness 11.7
portrait 11.6
together 11.4
group 11.3
musical instrument 11.2
family 10.7
teacher 10.6
fashion 10.5
bride 10.5
business 10.3
wind instrument 10.2
girls 10
silhouette 9.9
life 9.9
style 9.6
groom 9.5
kin 9.4
art 9.3
dance 9.2
city 9.1
modern 9.1
human 9
fun 9
stage 9
performer 8.9
home 8.8
smiling 8.7
dancing 8.7
two 8.5
travel 8.4
professional 8.4
black 8.4
hand 8.4
leisure 8.3
active 8.3
wedding 8.3
child 8.2
brass 8.2
clothing 8.2
lady 8.1
new 8.1
dancer 8
interior 8
lifestyle 7.9
businessman 7.9
educator 7.9
love 7.9
model 7.8
play 7.8
party 7.7
mother 7.7
old 7.7
joy 7.5
dark 7.5
patient 7.4
tradition 7.4
building 7.2
holiday 7.2
smile 7.1
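
The tag/confidence pairs above follow the shape of Imagga's v2 /tags endpoint. A short sketch against that REST API follows; the endpoint and response layout are assumptions from Imagga's public documentation, and the credentials and image URL are placeholders.

    import requests

    # Tagging call: Imagga returns {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")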

Google
created on 2022-03-05

Dress 88
Art 81.5
Vintage clothing 74.3
Font 73.2
Hat 68.9
Event 68.5
Painting 67.6
Illustration 65.9
Visual arts 64.9
Suit 64.9
History 63.7
Advertising 63.6
Room 62.2
Team 61
Crew 59.3
Monochrome 58.9
Photo caption 58.7
Recreation 56.6
Retro style 55.7
Vintage advertisement 54.7
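
The labels above match the shape of Google Cloud Vision label detection, which scores each label between 0 and 1 (scaled to 0-100 here). A minimal sketch with the google-cloud-vision client follows; the image URI is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.org/photo.jpg"  # placeholder image location

    # Label detection: each annotation carries a description and a 0-1 score.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")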

Microsoft
created on 2022-03-05

text 98.8
person 98.2
clothing 93.1
woman 78.1
dance 72
drawing 70.4
group 68.1
cartoon 67
old 50.2
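
The Microsoft tags above resemble the output of Azure Computer Vision's Tag Image operation. A sketch against the REST endpoint follows; the API version (v3.2), resource endpoint, and key are assumptions and placeholders rather than a record of the actual service call.

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder resource

    # Tag Image: returns {"tags": [{"name": ..., "confidence": ...}]} with 0-1 confidences.
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY", "Content-Type": "application/json"},
        json={"url": "https://example.org/photo.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")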

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 74.9%
Sad 35.4%
Calm 23.4%
Confused 21.8%
Happy 10.4%
Angry 3.3%
Surprised 2.5%
Disgusted 1.6%
Fear 1.5%

AWS Rekognition

Age 33-41
Gender Female, 62.7%
Happy 51.9%
Sad 39.2%
Calm 7.7%
Angry 0.3%
Disgusted 0.3%
Surprised 0.3%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Female, 74.1%
Surprised 24.6%
Confused 24%
Calm 20.1%
Sad 15.9%
Fear 6.5%
Happy 6.3%
Angry 1.4%
Disgusted 1.4%

AWS Rekognition

Age 48-54
Gender Female, 55.8%
Calm 98.5%
Happy 0.7%
Sad 0.5%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Female, 78.5%
Calm 94.3%
Happy 2.4%
Confused 1.9%
Sad 0.5%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 89.7%
Happy 79.9%
Sad 13%
Calm 2.1%
Angry 1.4%
Disgusted 1.3%
Surprised 0.9%
Fear 0.8%
Confused 0.7%

AWS Rekognition

Age 18-24
Gender Female, 99.3%
Happy 34.7%
Sad 20.9%
Calm 11.6%
Confused 9.4%
Surprised 8.2%
Disgusted 6.4%
Fear 6.2%
Angry 2.5%

AWS Rekognition

Age 34-42
Gender Male, 99.8%
Calm 96.7%
Sad 1.6%
Surprised 0.9%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%
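
Each block above (age range, gender with confidence, and ranked emotion scores) mirrors one entry in the FaceDetails list returned by Amazon Rekognition's detect_faces when all attributes are requested. A minimal boto3 sketch follows; the S3 location is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photos/4.2002.18888.jpg"}},
        Attributes=["ALL"],  # needed for age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort to match the high-to-low listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")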

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.2%
Person 99.1%
Person 98.9%
Person 98.6%
Person 96.3%
Person 95.6%
Person 95.6%
Person 94.8%
Person 93.4%
Person 90.7%
Person 84.5%
Person 81.7%
Person 64.4%

Shoe
Shoe 72.7%
Shoe 69.3%
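
The per-instance Person and Shoe confidences above correspond to the Instances entries (individual bounding boxes) inside an Amazon Rekognition detect_labels response. A minimal boto3 sketch follows; the S3 location is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photos/4.2002.18888.jpg"}},
        MinConfidence=55,
    )
    # Each label may carry "Instances": one entry per detected bounding box,
    # which is where the individual "Person 99.2%", "Shoe 72.7%", ... figures come from.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f})")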

Text analysis

Amazon

HM2CO
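
A detected string such as the one above has the shape of output from Amazon Rekognition's detect_text. A minimal boto3 sketch follows; the S3 location is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photos/4.2002.18888.jpg"}}
    )
    # Rekognition returns both LINE and WORD detections; print the detected lines.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")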