Human Generated Data

Title

Untitled (man and woman with long dress train sitting at head of room with children in period costumes sitting on both sides in foreground)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9248

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Clothing 98.9
Apparel 98.9
Person 98.7
Person 98.5
Person 97.1
Person 94.8
Dress 90.9
Shop 87
Person 86.9
Painting 85.9
Art 85.9
Evening Dress 84.8
Fashion 84.8
Gown 84.8
Robe 84.8
Female 73.2
Window Display 67.5
Woman 59.8
Boutique 57.3
People 56.5
Lace 55.5
Text 55.4

Clarifai
created on 2023-10-26

people 99.9
adult 98.3
veil 98.3
group 97.9
princess 97.5
wear 97.4
woman 97.1
man 95.1
wedding 94.2
print 93.8
art 93.7
illustration 93.1
child 92.9
ceremony 91.3
leader 91.2
prince 90.7
dress 90.6
bride 90.1
outfit 90.1
furniture 88.8

Imagga
created on 2022-01-23

groom 41.9
grand piano 33.6
stringed instrument 27.9
piano 27.1
keyboard instrument 24.4
percussion instrument 22
musical instrument 20.4
people 19.5
man 19.5
person 16.2
male 15.6
relaxation 12.6
business 12.1
black 12.1
water 12
bride 11.5
sky 11.5
light 11.4
outdoors 11.2
men 11.2
religion 10.7
history 10.7
adult 10.6
businessman 10.6
travel 10.6
sitting 10.3
love 10.3
color 10
silhouette 9.9
office 9.6
statue 9.6
women 9.5
happy 9.4
alone 9.1
old 9.1
dress 9
sunset 9
holiday 8.6
relax 8.4
city 8.3
professional 8.1
day 7.8
couple 7.8
outdoor 7.6
fun 7.5
tourism 7.4
building 7.4
vacation 7.4
wedding 7.4
lady 7.3
relaxing 7.3
art 7.2
success 7.2
happiness 7
architecture 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.2
wedding dress 92.2
bride 81.7
black and white 79.6
person 74.7
clothing 73.2
woman 71.9
dress 50.7
painting 18

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 98.8%
Happy 88.6%
Calm 3%
Surprised 2%
Confused 1.8%
Disgusted 1.8%
Sad 1.5%
Angry 1.1%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Male, 99.5%
Sad 47.8%
Calm 27.2%
Fear 11.8%
Happy 7.2%
Angry 2.1%
Surprised 1.6%
Confused 1.3%
Disgusted 1.1%

AWS Rekognition

Age 27-37
Gender Male, 56.7%
Sad 75.1%
Angry 8.5%
Disgusted 5.9%
Happy 3.2%
Calm 2.8%
Surprised 1.9%
Confused 1.3%
Fear 1.3%

AWS Rekognition

Age 39-47
Gender Male, 76.9%
Happy 78.6%
Surprised 8.7%
Calm 3.6%
Sad 2.6%
Fear 2.6%
Disgusted 1.5%
Angry 1.4%
Confused 1%

AWS Rekognition

Age 26-36
Gender Male, 85%
Happy 88.8%
Surprised 8.8%
Calm 1%
Fear 0.6%
Disgusted 0.3%
Angry 0.2%
Sad 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 85.9%

Captions

Microsoft
created on 2022-01-23

calendar 80.7%