Human Generated Data

Title

Untitled (debutantes)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19220

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-02-25

Person 98.7
Human 98.7
Person 98.4
Person 90.3
Person 89.5
Person 87.6
Person 86.3
Person 85
Person 83.3
Text 78.2
Person 77.9
Person 75
Person 70.9
Screen 67.9
Electronics 67.9
Monitor 64.8
Display 64.8
LCD Screen 59.8
Crowd 57.3
Person 56.4
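
The label/score pairs above are typical output from Amazon Rekognition's DetectLabels API, with confidences reported in percent. Below is a minimal sketch of how such tags could be produced with boto3; the image path and confidence threshold are illustrative placeholders, not values taken from this record.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; the image path and threshold
# are placeholders, not values from this record.
import boto3

def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a percent confidence, matching the
    # "Person 98.7"-style rows above.
    return [(lbl["Name"], lbl["Confidence"]) for lbl in response["Labels"]]

if __name__ == "__main__":
    for name, conf in detect_labels("photo.jpg"):
        print(f"{name} {conf:.1f}")
```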

Clarifai
created on 2023-10-22

people 99.7
man 97.8
adult 96.9
woman 96.8
group 96.5
group together 92.6
portrait 91.8
music 90.4
wear 85.3
actress 83
actor 82.9
administration 79.5
leader 79.1
education 79
child 78.7
many 78.7
dress 76.9
school 76.6
indoors 76.2
musician 74.6
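
Clarifai's scores come from its general image-recognition model. A hedged sketch using the classic v2 REST endpoint follows; the endpoint shape, model ID, and key handling are assumptions from the older public API and may differ for current accounts.

```python
# Hedged sketch: tagging with Clarifai's classic v2 REST API. The
# endpoint, model ID, and payload shape are assumptions and may have
# changed; the API key and image path are placeholders.
import base64
import requests

def clarifai_tags(image_path: str, api_key: str):
    with open(image_path, "rb") as f:
        data = base64.b64encode(f.read()).decode()
    r = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {api_key}"},
        json={"inputs": [{"data": {"image": {"base64": data}}}]},
        timeout=30,
    )
    r.raise_for_status()
    concepts = r.json()["outputs"][0]["data"]["concepts"]
    # Concept values are 0-1; scale to percent to match the rows above.
    return [(c["name"], 100 * c["value"]) for c in concepts]
```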

Imagga
created on 2022-02-25

man 26.9
people 25.6
room 22.7
person 21.3
male 20.6
adult 19.4
interior 17.7
modern 17.5
balcony 16.8
home 16.7
musical instrument 16.6
indoor 16.4
indoors 15.8
happy 15.7
men 15.4
women 15
table 14.4
office 14
percussion instrument 13.9
sitting 13.7
business 13.4
couple 13.1
house 12.5
window 12.4
businessman 12.3
portrait 12.3
smile 12.1
classroom 11.6
smiling 11.6
lifestyle 11.6
professional 11.3
floor 10.2
chair 10.2
inside 10.1
communication 10.1
dress 9.9
child 9.8
family 9.8
cheerful 9.7
group 9.7
furniture 9.6
living 9.5
happiness 9.4
two 9.3
bride 9.2
teacher 9
design 9
sofa 8.9
job 8.8
together 8.8
work 8.6
glass 8.6
meeting 8.5
executive 8.3
worker 8.1
light 8
father 7.9
love 7.9
living room 7.8
boy 7.8
black 7.8
building 7.8
travel 7.7
corporate 7.7
structure 7.5
fun 7.5
mature 7.4
care 7.4
board 7.2
domestic 7.2
team 7.2
television 7.1
kid 7.1
mother 7
architecture 7
blackboard 7
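
Imagga's tags can be reproduced with its v2 REST API. The sketch below follows the endpoint and response shape of Imagga's public documentation as best recalled; the credentials and file path are placeholders.

```python
# Hedged sketch: tagging with the Imagga v2 /tags endpoint, which uses
# HTTP Basic auth with an API key/secret pair. Path and credentials are
# placeholders.
import requests

def imagga_tags(image_path: str, api_key: str, api_secret: str):
    with open(image_path, "rb") as f:
        r = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(api_key, api_secret),
            files={"image": f},
            timeout=30,
        )
    r.raise_for_status()
    tags = r.json()["result"]["tags"]
    # Confidence is already a percentage, like the "man 26.9" rows above.
    return [(t["tag"]["en"], t["confidence"]) for t in tags]
```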

Google
created on 2022-02-25

(no tags returned)

Microsoft
created on 2022-02-25

text 97.1
indoor 95.2
person 93.5
clothing 92.1
black 82.7
wedding dress 78.6
woman 76.8
dress 66.9
fireplace 26.6
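
The Microsoft tags correspond to Azure Computer Vision's image-tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image path are placeholders, and the SDK's 0-1 confidences are scaled to percent to match the rows above.

```python
# Hedged sketch: image tagging via the Azure Computer Vision SDK.
# Endpoint, key, and image path are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

def azure_tags(image_path: str, endpoint: str, key: str):
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as f:
        result = client.tag_image_in_stream(f)
    # SDK confidences are 0-1; scale to percent like "text 97.1" above.
    return [(t.name, 100 * t.confidence) for t in result.tags]
```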

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 100%
Happy 96.8%
Sad 1%
Disgusted 0.5%
Fear 0.5%
Surprised 0.5%
Confused 0.3%
Calm 0.2%
Angry 0.2%

AWS Rekognition

Age 30-40
Gender Female, 100%
Happy 54.5%
Confused 12%
Disgusted 10.4%
Sad 8%
Surprised 7.4%
Calm 2.8%
Angry 2.7%
Fear 2.2%

AWS Rekognition

Age 23-31
Gender Female, 100%
Happy 57.2%
Calm 10.9%
Sad 9%
Angry 8.4%
Fear 5.7%
Surprised 3%
Confused 2.9%
Disgusted 2.9%

AWS Rekognition

Age 23-31
Gender Female, 100%
Calm 61.2%
Happy 10.6%
Angry 6.8%
Surprised 5.8%
Confused 5.4%
Disgusted 5.2%
Fear 2.6%
Sad 2.4%

AWS Rekognition

Age 24-34
Gender Female, 99.9%
Happy 99.7%
Surprised 0.1%
Fear 0.1%
Sad 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Calm 0%

AWS Rekognition

Age 16-24
Gender Female, 100%
Calm 99.2%
Confused 0.3%
Sad 0.2%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Female, 100%
Happy 86.9%
Confused 3.4%
Calm 2.7%
Fear 2%
Sad 1.9%
Surprised 1.3%
Angry 1.1%
Disgusted 0.8%

AWS Rekognition

Age 22-30
Gender Female, 100%
Calm 34.2%
Confused 22.5%
Sad 21.7%
Surprised 11.4%
Disgusted 3%
Angry 2.8%
Happy 2.6%
Fear 1.8%

AWS Rekognition

Age 21-29
Gender Female, 100%
Happy 99.5%
Angry 0.1%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%
Sad 0.1%
Confused 0%
Calm 0%

AWS Rekognition

Age 23-33
Gender Female, 79.6%
Disgusted 59.4%
Surprised 11.5%
Angry 10.5%
Confused 7.6%
Happy 4.4%
Calm 3.6%
Fear 1.9%
Sad 1%

AWS Rekognition

Age 20-28
Gender Female, 100%
Happy 99.6%
Angry 0.1%
Surprised 0.1%
Sad 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0%
Calm 0%
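
The age-range, gender, and emotion blocks above are the standard output of Amazon Rekognition's DetectFaces call with full attributes. A minimal sketch with boto3, using a placeholder file path:

```python
# Minimal sketch: per-face attributes with Rekognition DetectFaces.
# Attributes=["ALL"] requests age range, gender, and emotions.
import boto3

def analyze_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions sorted by confidence, as in the blocks above.
        for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emo['Type'].title()} {emo['Confidence']:.1f}%")
```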

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
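
The Google Vision blocks report per-face likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision client, assuming credentials are configured and using a placeholder path:

```python
# Minimal sketch: Google Cloud Vision face detection, which returns
# likelihood enums (VERY_UNLIKELY ... VERY_LIKELY) for each attribute.
from google.cloud import vision

def face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    for face in client.face_detection(image=image).face_annotations:
        for label, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            # Enum name VERY_UNLIKELY maps to "Very unlikely" above.
            print(label, vision.Likelihood(value).name)
```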

Feature analysis

Amazon

Person
Person 98.7%
Person 98.4%
Person 90.3%
Person 89.5%
Person 87.6%
Person 86.3%
Person 85%
Person 83.3%
Person 77.9%
Person 75%
Person 70.9%
Person 56.4%
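
The per-person percentages above come from instance detections: in a DetectLabels response, a label such as Person can carry an Instances list with a confidence and bounding box for each occurrence. A hedged sketch, again with a placeholder path:

```python
# Hedged sketch: per-instance "Person" confidences from DetectLabels.
import boto3

def person_instances(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]
    for label in labels:
        if label["Name"] == "Person":
            for inst in label.get("Instances", []):
                print(f"Person {inst['Confidence']:.1f}%")
```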

Text analysis

Amazon

64
DEC

Google

DEC 64 174 3.
DEC
64
174
3.
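
The detected strings ("DEC", "64", and the Google line "DEC 64 174 3.") are typical OCR output: a full line plus its individual words. A minimal sketch of the Amazon side with Rekognition's DetectText, using a placeholder path:

```python
# Minimal sketch: OCR with Rekognition DetectText, which returns both
# LINE and WORD detections with percent confidences.
import boto3

def detect_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    for det in response["TextDetections"]:
        print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}%")
```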