Human Generated Data

Title

Untitled (debutantes)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19218

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 97.7%
Human 97.7%
Person 97.2%
Person 90.7%
Person 88.7%
Person 88.6%
Person 88.3%
Person 86%
Person 83.3%
Person 82.3%
Shop 78.6%
Person 77.8%
Person 67%
Text 65.6%
Crowd 63.5%
Clothing 62.2%
Apparel 62.2%
Person 61.8%
Suit 58.1%
Coat 58.1%
Overcoat 58.1%
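
These labels have the shape of output from Amazon Rekognition's DetectLabels API (a label name plus a 0-100 confidence score). A minimal sketch of how such tags could be generated with boto3; the S3 bucket and key names are hypothetical placeholders:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical S3 location for the scanned photograph.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-debutantes.jpg"}},
    MaxLabels=25,
    MinConfidence=55.0,
)

# Each label carries a name and a 0-100 confidence score,
# matching the format of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")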

Clarifai
created on 2023-10-22

people 99.7%
woman 96.9%
music 96.2%
adult 96.2%
man 96%
portrait 95.5%
dress 94.6%
group 94%
actress 91.5%
actor 89.6%
wear 89.5%
model 88.5%
singer 85.3%
presentation 85%
indoors 84%
girl 84%
fashion 82.7%
dancer 80.5%
musician 80.2%
group together 80.1%
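
These concepts resemble the output of Clarifai's predict endpoint with its general image recognition model. A rough sketch against the v2 REST API; the API key, model ID, and image URL are all placeholders, and the exact endpoint form may vary by account setup:

import requests

API_KEY = "CLARIFAI_API_KEY"           # placeholder
MODEL_ID = "general-image-recognition" # placeholder model ID
IMAGE_URL = "https://example.org/untitled-debutantes.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concept values are on a 0-1 scale; scaling by 100 yields
# percentages like those listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")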

Imagga
created on 2022-02-25

blackboard 27.4%
balcony 18%
shop 17.5%
barbershop 17.4%
people 17.3%
window 16%
man 14.1%
old 13.9%
black 12.6%
architecture 12.5%
room 12.2%
building 12.1%
glass 11.7%
mercantile establishment 11.6%
vintage 11.6%
urban 11.4%
male 11.3%
person 10.8%
bride 10.8%
city 10.8%
family 10.7%
retro 10.6%
interior 10.6%
negative 10.6%
musical instrument 10.4%
dress 9.9%
travel 9.9%
design 9.6%
film 9.5%
art 9.2%
wedding 9.2%
indoor 9.1%
business 9.1%
stringed instrument 8.9%
working 8.8%
decoration 8.8%
couple 8.7%
flowers 8.7%
upright 8.7%
antique 8.7%
bouquet 8.6%
wall 8.5%
grunge 8.5%
percussion instrument 8.5%
house 8.4%
tourism 8.2%
historic 8.2%
structure 8.2%
light 8%
women 7.9%
love 7.9%
place of business 7.7%
office 7.7%
piano 7.4%
street 7.4%
aged 7.2%
home 7.2%
celebration 7.2%
history 7.2%
portrait 7.1%
decor 7.1%
businessman 7.1%
paper 7.1%
modern 7%
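
These tags follow the format of Imagga's image tagging API. A minimal sketch against its v2 tags endpoint, which uses HTTP basic auth; the credentials and image URL are placeholders:

import requests

API_KEY = "IMAGGA_API_KEY"        # placeholder
API_SECRET = "IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/untitled-debutantes.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

# Imagga reports confidences on a 0-100 scale, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}%")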

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.6%
wedding dress 96%
clothing 92.3%
dress 91.2%
person 91%
bride 88.1%
woman 86.9%
black 70.9%
fireplace 29.3%
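
These tags match the shape of Azure Computer Vision's image tagging output. A minimal sketch against the v3.2 REST tag endpoint; the endpoint, key, and image URL are placeholders from a hypothetical Azure resource:

import requests

ENDPOINT = "https://example-region.api.cognitive.microsoft.com"  # placeholder
KEY = "AZURE_CV_KEY"  # placeholder
IMAGE_URL = "https://example.org/untitled-debutantes.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)

# Confidences are 0-1; scaling by 100 matches the percentages above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")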

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 100%
Disgusted 46.6%
Happy 36.7%
Confused 4.7%
Sad 4.1%
Calm 3.2%
Surprised 1.7%
Angry 1.7%
Fear 1.3%

AWS Rekognition

Age 24-34
Gender Female, 100%
Happy 99.5%
Sad 0.1%
Calm 0.1%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Female, 100%
Calm 97.7%
Confused 0.5%
Sad 0.4%
Happy 0.4%
Surprised 0.3%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Female, 100%
Happy 92.4%
Confused 2.2%
Surprised 1.2%
Calm 1.1%
Angry 0.9%
Sad 0.9%
Fear 0.7%
Disgusted 0.6%

AWS Rekognition

Age 24-34
Gender Female, 100%
Calm 82.4%
Confused 6.6%
Happy 5.2%
Angry 2.4%
Surprised 1.4%
Disgusted 0.9%
Sad 0.6%
Fear 0.4%

AWS Rekognition

Age 21-29
Gender Female, 100%
Happy 99.3%
Angry 0.4%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%
Sad 0%
Calm 0%
Confused 0%

AWS Rekognition

Age 26-36
Gender Female, 100%
Sad 66.7%
Disgusted 14.2%
Happy 7.4%
Confused 3.5%
Angry 2.6%
Surprised 2.2%
Fear 2.1%
Calm 1.3%

AWS Rekognition

Age 24-34
Gender Female, 98.4%
Calm 70.1%
Sad 13.1%
Surprised 4.3%
Confused 3.5%
Happy 2.8%
Disgusted 2.7%
Angry 2.2%
Fear 1.3%

AWS Rekognition

Age 22-30
Gender Female, 100%
Happy 98.1%
Fear 0.5%
Angry 0.5%
Surprised 0.3%
Disgusted 0.2%
Sad 0.2%
Calm 0.1%
Confused 0.1%

AWS Rekognition

Age 16-24
Gender Female, 98.8%
Happy 95.5%
Sad 2.4%
Fear 0.7%
Surprised 0.4%
Disgusted 0.4%
Angry 0.2%
Confused 0.2%
Calm 0.1%

AWS Rekognition

Age 16-24
Gender Female, 70.5%
Happy 89%
Confused 4%
Angry 2.1%
Calm 1.3%
Fear 1.1%
Sad 1%
Surprised 0.8%
Disgusted 0.7%
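
The per-face age ranges, gender estimates, and emotion scores above have the shape of Amazon Rekognition's DetectFaces output. A minimal boto3 sketch; the S3 location is a placeholder:

import boto3

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-debutantes.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions sorted from most to least confident, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")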

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
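
The likelihood buckets above (Very unlikely through Very likely) are how Google Cloud Vision reports face attributes, rather than as percentages. A minimal sketch with the google-cloud-vision client; the local file path is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder path to the scanned photograph.
with open("untitled-debutantes.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each detected face reports a per-attribute likelihood bucket.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)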

Feature analysis

Amazon

Person
Person 97.7%
Person 97.2%
Person 90.7%
Person 88.7%
Person 88.6%
Person 88.3%
Person 86%
Person 83.3%
Person 82.3%
Person 77.8%
Person 67%
Person 61.8%

Text analysis

Amazon

64
DEC
173
3

Google

DEC 64 673
DEC
64
673
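
These entries are raw OCR fragments. With Amazon Rekognition, such results come from the DetectText API, which returns both LINE groupings and individual WORD detections; Google Cloud Vision's text detection likewise returns a full line alongside its component words, which is why the Google list shows both "DEC 64 673" and its pieces. A minimal boto3 sketch with a placeholder S3 location:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-debutantes.jpg"}}
)

# Each detection is typed LINE or WORD; LINEs group the WORDs.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])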