Human Generated Data

Title

Untitled (man and woman on chairs, people standing behind)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16708

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 99.4
Person 98.7
Clothing 98.6
Apparel 98.6
Person 96.5
Person 95.8
Person 94.9
Person 93.3
Clinic 93.1
Person 88.9
Person 85.9
Female 76.7
People 75.3
Fashion 73.6
Gown 72.1
Indoors 68.5
Robe 67.6
Dress 62
Wedding 60.6
Woman 60.1
Operating Theatre 56.3
Hospital 56.3

Clarifai
created on 2023-10-28

people 99.9
group 99.7
adult 97.6
man 96.6
woman 95.4
group together 95.3
wedding 93.4
veil 92.6
dancing 91.2
many 90.7
child 90.3
education 89.1
several 89.1
music 89
interaction 87.7
wear 87.3
actor 87.1
princess 86.2
dancer 86.1
outfit 85.1

Imagga
created on 2022-02-26

groom 100
negative 54.2
film 42.7
bride 38.5
wedding 34.9
photographic paper 33
couple 32.2
people 29.5
dress 27.1
bouquet 25.7
love 25.2
person 23
man 22.2
photographic equipment 22
marriage 20.9
happiness 19.6
married 19.2
celebration 18.3
two 16.9
gown 16.7
adult 16.6
happy 15.7
flowers 15.6
male 15.6
ceremony 15.5
husband 15.3
bridal 14.6
romantic 14.2
portrait 14.2
men 13.7
women 13.4
wife 13.3
flower 12.3
decoration 12.1
party 12
family 11.6
indoors 11.4
rose 11.2
home 11.2
wed 10.8
suit 10.8
veil 10.8
romance 10.7
lady 10.5
human 10.5
old 10.4
holiday 10
fashion 9.8
marble 9.8
engagement 9.6
together 9.6
tradition 9.2
cheerful 8.9
new 8.9
matrimony 8.9
interior 8.8
smiling 8.7
day 8.6
face 8.5
pair 8.5
senior 8.4
elegance 8.4
drink 8.3
traditional 8.3
medical 7.9
hair 7.9
newly 7.9
column 7.9
commitment 7.9
black 7.8
elegant 7.7
attractive 7.7
worker 7.6
future 7.4
room 7.3
group 7.2
lifestyle 7.2
history 7.1
working 7.1
coat 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

wedding dress 98.2
person 96.8
dress 96.3
bride 94.2
woman 89.8
clothing 88.5
text 74.8
old 50.7

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.1%
Sad 95.8%
Calm 1.7%
Confused 0.7%
Angry 0.6%
Disgusted 0.4%
Happy 0.3%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 50-58
Gender Male, 99%
Calm 36.2%
Surprised 32.8%
Angry 17.6%
Happy 5.7%
Sad 2.8%
Fear 1.9%
Confused 1.7%
Disgusted 1.3%

AWS Rekognition

Age 31-41
Gender Male, 86.1%
Sad 66.3%
Calm 12.9%
Angry 6.4%
Disgusted 4.5%
Happy 4.2%
Confused 2.6%
Surprised 2.5%
Fear 0.6%

AWS Rekognition

Age 26-36
Gender Female, 55.1%
Happy 46.9%
Calm 33.9%
Fear 7.3%
Surprised 4.6%
Sad 2.9%
Confused 2.1%
Angry 1.4%
Disgusted 1%

AWS Rekognition

Age 47-53
Gender Male, 97.7%
Calm 98.3%
Sad 1.4%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.5%
Person 99.4%
Person 98.7%
Person 96.5%
Person 95.8%
Person 94.9%
Person 93.3%
Person 88.9%
Person 85.9%

Text analysis

Amazon

2
N
KODAK--ITW

Google

MJI7-- YT37A°2-- AGO
MJI7--
YT37A°2--
AGO