Human Generated Data

Title

Untitled (five women in hats gathered around railing outdoors, looking at camera)

Date

1934

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13013

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Clothing 100%
Apparel 100%
Person 99%
Human 99%
Person 98.9%
Person 98.8%
Person 98.5%
Person 96.7%
Robe 96.3%
Fashion 96.3%
Gown 95.1%
Wedding 92.3%
Bride 86.9%
Wedding Gown 86.9%
Female 86.5%
Helmet 85.7%
Woman 72.7%
Bridegroom 66.1%
People 61.4%
Suit 60.1%
Coat 60.1%
Overcoat 60.1%
Person 59.3%
Photography 59%
Photo 59%
Dress 57.7%
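
The Amazon labels above are the kind of output produced by Amazon Rekognition's DetectLabels operation, which reports label names with confidence scores on a 0-100 scale. A minimal sketch with boto3; the bucket, object key, and region are placeholder assumptions, since the record does not include an image location:

```python
import boto3

# Placeholder image location and region; not part of this record.
BUCKET = "example-bucket"
KEY = "photographs/example.jpg"

client = boto3.client("rekognition", region_name="us-east-1")

# DetectLabels returns label names with 0-100 confidence scores,
# matching the listing above (e.g. "Clothing 100%", "Person 99%").
response = client.detect_labels(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")
```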

Clarifai
created on 2023-10-29

people 99.9%
group 99.4%
adult 97.4%
child 97.4%
woman 96.4%
man 96.4%
veil 95.6%
several 93.5%
group together 92.9%
many 92.4%
wear 91.1%
music 89.9%
leader 89%
administration 87.5%
outfit 87.5%
wedding 87.4%
ceremony 87%
actor 84.2%
princess 83.6%
musician 83.5%
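
The Clarifai concepts above resemble output from Clarifai's public general image recognition model. A hedged sketch using the clarifai-grpc client; the access token, the "clarifai"/"main" community app identifiers, the model ID, and the image URL are assumptions, not details taken from this record:

```python
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

# Placeholder credentials and image URL; not part of this record.
PAT = "your_clarifai_access_token"
IMAGE_URL = "https://example.org/photograph.jpg"

channel = ClarifaiChannel.get_grpc_channel()
stub = service_pb2_grpc.V2Stub(channel)
metadata = (("authorization", f"Key {PAT}"),)

request = service_pb2.PostModelOutputsRequest(
    # Assumed: Clarifai's public general-purpose classifier hosted in the
    # "clarifai"/"main" community app.
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(image=resources_pb2.Image(url=IMAGE_URL))
        )
    ],
)
response = stub.PostModelOutputs(request, metadata=metadata)

# Concept values are 0-1 probabilities; scale to percentages as listed above.
for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))
```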

Imagga
created on 2022-02-05

musical instrument 37.9%
wind instrument 34%
sax 32.1%
accordion 24.4%
man 23.5%
keyboard instrument 19.7%
people 18.4%
person 17.7%
male 17%
adult 16.7%
brass 15.8%
black 13.8%
groom 12.8%
style 11.9%
bride 11.5%
art 11.3%
men 11.2%
portrait 11%
dress 10.8%
light 10.7%
couple 10.5%
cornet 10.2%
room 10.2%
music 10%
musician 10%
businessman 9.7%
life 9.6%
wedding 9.2%
business 9.1%
holding 9.1%
religion 9%
musical 8.6%
fashion 8.3%
silhouette 8.3%
human 8.2%
professional 8.2%
suit 8.1%
hair 7.9%
women 7.9%
love 7.9%
sing 7.9%
face 7.8%
model 7.8%
wall 7.7%
two 7.6%
device 7.6%
world 7.4%
bass 7.3%
sensuality 7.3%
color 7.2%
dirty 7.2%
sexy 7.2%
lifestyle 7.2%
romantic 7.1%
dance 7.1%
happiness 7%
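
The Imagga tags follow the shape of Imagga's /v2/tags REST endpoint, which returns English tag names with 0-100 confidence scores. A sketch with the requests library; the API key, secret, and image URL are placeholder assumptions:

```python
import requests

# Placeholder credentials and image URL; not part of this record.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/photograph.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry pairs a 0-100 confidence with a tag name keyed by language.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```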

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

wedding dress 95.8%
bride 91.3%
text 90.7%
person 88.8%
clothing 86.4%
wedding 86%
black and white 66.6%
woman 64.2%
dress 53.6%
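
The Microsoft tags are consistent with the Azure Computer Vision Analyze Image operation requesting the Tags visual feature, which reports 0-1 confidences. A sketch against the v3.2 REST API; the resource endpoint, subscription key, and image URL are placeholder assumptions:

```python
import requests

# Placeholder endpoint, key, and image URL; not part of this record.
ENDPOINT = "https://example-resource.cognitiveservices.azure.com"
KEY = "your_azure_vision_key"
IMAGE_URL = "https://example.org/photograph.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Confidences are 0-1 floats; scale by 100 to match the listing above.
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```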

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 67.3%
Calm 99.4%
Surprised 0.5%
Happy 0%
Sad 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 48-54
Gender Female, 72.5%
Sad 73.1%
Happy 19%
Calm 2.2%
Fear 1.6%
Confused 1.6%
Disgusted 1%
Angry 0.9%
Surprised 0.6%

AWS Rekognition

Age 33-41
Gender Male, 89.5%
Calm 49.2%
Happy 17.2%
Fear 12.8%
Surprised 7.1%
Sad 6.5%
Angry 3.4%
Disgusted 2%
Confused 1.9%

AWS Rekognition

Age 31-41
Gender Male, 78.3%
Happy 46.1%
Disgusted 43.8%
Confused 3%
Calm 2.2%
Surprised 1.9%
Sad 1.8%
Angry 0.7%
Fear 0.4%

AWS Rekognition

Age 21-29
Gender Male, 81.6%
Happy 74.8%
Calm 11.3%
Sad 5.8%
Fear 2.9%
Confused 1.7%
Disgusted 1.7%
Surprised 1.3%
Angry 0.6%

AWS Rekognition

Age 37-45
Gender Male, 93%
Happy 84.4%
Calm 7.3%
Disgusted 3.3%
Sad 1.6%
Surprised 1.4%
Confused 1.1%
Fear 0.5%
Angry 0.4%
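
The age ranges, gender estimates, and emotion percentages in the AWS Rekognition blocks above are the kind of per-face output returned by the DetectFaces operation when all facial attributes are requested. A minimal boto3 sketch; the bucket, object key, and region are placeholder assumptions:

```python
import boto3

# Placeholder image location and region; not part of this record.
BUCKET = "example-bucket"
KEY = "photographs/example.jpg"

client = boto3.client("rekognition", region_name="us-east-1")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Each emotion type carries its own confidence, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```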

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
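
The Google Vision blocks above report per-face likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A sketch with the google-cloud-vision Python client; the image URI is a placeholder assumption:

```python
from google.cloud import vision

# Placeholder image URI; not part of this record.
IMAGE_URI = "https://example.org/photograph.jpg"

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri=IMAGE_URI))

# Face detection returns likelihood enums (VERY_UNLIKELY ... VERY_LIKELY)
# for surprise, anger, sorrow, joy, headwear, and blur, as listed above.
response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```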

Feature analysis

Amazon

Person 99%
Person 98.9%
Person 98.8%
Person 98.5%
Person 96.7%
Person 59.3%
Helmet 85.7%

Categories

Text analysis

Amazon

NZ
yours NZ DOD
DOD
yours
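
The fragments above ("NZ", "DOD", "yours") are the kind of raw strings Amazon Rekognition's DetectText returns for handwriting or edge markings in a scanned print. A minimal boto3 sketch; the bucket, object key, and region are placeholder assumptions:

```python
import boto3

# Placeholder image location and region; not part of this record.
BUCKET = "example-bucket"
KEY = "photographs/example.jpg"

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_text(Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}})

# Detections come back as LINE and WORD entries with confidence scores.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")
```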