Human Generated Data

Title

Untitled (dancing at wedding reception)

Date

1960s

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.896

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.5
Person 99.5
Person 99.4
Person 99.2
Person 99
Poster 95.7
Advertisement 95.7
Person 94.5
Clothing 92.5
Apparel 92.5
Person 88.9
Person 82
People 68.1
Person 67
Girl 59.4
Female 59.4
Crowd 57.5
Gown 56.6
Fashion 56.6

Clarifai
created on 2023-10-26

people 99.7
man 98.3
woman 97.6
group 96.8
portrait 96.6
monochrome 95.3
collage 95
adult 94.3
wedding 93.7
girl 93.4
art 92
documentary 90.7
wear 88.9
sepia 88.4
child 87.4
street 85.9
bride 82.8
vintage 82.8
son 81.3
retro 80.5

Imagga
created on 2022-01-23

groom 28.1
man 21.5
male 18.5
people 18.4
couple 17.4
person 17.1
black 15.1
portrait 14.2
old 13.9
bride 13.9
world 13
office 12.8
adult 12.8
vintage 12.4
dress 11.7
businessman 11.5
business 10.9
brass 10.8
blackboard 10.8
family 10.7
happy 10.6
wind instrument 10.2
wedding 10.1
aged 9.9
room 9.9
art 9.9
men 9.4
happiness 9.4
professional 9.4
two 9.3
love 8.7
child 8.5
future 8.4
musical instrument 8.2
retro 8.2
romantic 8
clothing 7.9
photograph 7.8
bouquet 7.8
color 7.8
party 7.7
grunge 7.7
one 7.5
envelope 7.4
symbol 7.4
newspaper 7.3
cheerful 7.3
alone 7.3
smiling 7.2
celebration 7.2
women 7.1
card 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.4
person 96.2
wall 95.3
posing 94.8
gallery 91.3
old 90.1
clothing 88.9
woman 84.6
dress 82.4
group 81.2
people 70.4
room 52.1
vintage 34.6
picture frame 29.1

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 81.7%
Calm 24.8%
Confused 19.3%
Happy 16.2%
Surprised 15%
Sad 9.1%
Disgusted 7.4%
Angry 6.3%
Fear 1.9%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Happy 93.4%
Surprised 3.9%
Calm 0.6%
Sad 0.6%
Angry 0.6%
Fear 0.4%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 54-64
Gender Male, 99.7%
Calm 91.4%
Angry 5.4%
Sad 1%
Surprised 0.8%
Fear 0.4%
Confused 0.4%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 24-34
Gender Female, 99.4%
Calm 98.1%
Surprised 1.1%
Happy 0.3%
Confused 0.2%
Sad 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 59-67
Gender Male, 100%
Fear 34.1%
Angry 24.6%
Surprised 22.6%
Happy 12.3%
Disgusted 2.5%
Calm 1.9%
Sad 1.5%
Confused 0.7%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Happy 76.4%
Calm 10.7%
Confused 4.9%
Surprised 3.9%
Sad 2%
Fear 0.8%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 28-38
Gender Male, 100%
Calm 42.7%
Sad 42%
Surprised 5.1%
Angry 4.5%
Confused 2.8%
Happy 1.4%
Disgusted 0.7%
Fear 0.7%

AWS Rekognition

Age 33-41
Gender Female, 97.9%
Fear 93.4%
Sad 3.2%
Calm 1.6%
Surprised 0.9%
Disgusted 0.5%
Confused 0.3%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 25-35
Gender Male, 80%
Sad 79.7%
Calm 13.5%
Happy 2.4%
Fear 1.6%
Disgusted 1.1%
Angry 1%
Confused 0.6%
Surprised 0.2%

AWS Rekognition

Age 35-43
Gender Male, 98.8%
Sad 96%
Calm 2.5%
Angry 0.8%
Fear 0.2%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%
Happy 0%

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Poster 95.7%

Categories

Imagga

paintings art 99.6%

Text analysis

Amazon

X