Human Generated Data

Title

Untitled (A Touch of Class, 1972) from "Dream Girls"

Date

1989-1990

People

Artist: Deborah Bright, American, born 1950

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Fund for the Acquisition of Photographs, P1998.66

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Person 99.5
Human 99.5
Person 99.1
Restaurant 96.8
Person 95.5
Person 89
Sitting 85.2
Person 78.5
Cafe 77.9
Person 77.5
Cafeteria 75.1
Crowd 73.2
People 69.6
Dating 68
Pub 64
Meal 61.6
Food 61.6
Food Court 61.3
Bar Counter 60.5
Couch 56.2
Furniture 56.2
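
Each tag line above pairs a label with a confidence score in percent. As an illustration only (the parsing and threshold are assumptions, not part of this record), here is one way such flattened "label score" lines might be split apart and filtered to high-confidence tags:

```python
# Hypothetical post-processing of flattened "label score" tag lines,
# using a few of the Amazon tags listed above as sample input.
raw = """Person 99.5
Human 99.5
Restaurant 96.8
Sitting 85.2
Food Court 61.3"""

def parse_tags(text):
    """Split each 'Label score' line into a (label, confidence) pair.

    rsplit on the last space keeps multi-word labels like 'Food Court' intact.
    """
    tags = []
    for line in text.splitlines():
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

def filter_tags(tags, threshold=90.0):
    """Keep only tags at or above the confidence threshold (percent)."""
    return [t for t in tags if t[1] >= threshold]

print(filter_tags(parse_tags(raw)))  # high-confidence labels only
```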

Clarifai
created on 2021-04-03

people 99.9
group 99.7
adult 99.1
portrait 98.8
woman 98.3
group together 97.1
monochrome 95.5
man 95.3
three 93.1
wear 89.9
music 89.4
facial expression 89.1
actor 88.1
four 87.6
teacher 86.8
sit 86.7
education 85.8
school 85.6
actress 84.8
retro 84.3

Imagga
created on 2021-04-03

salon 54.4
hairdresser 40.4
people 35.7
adult 32.4
portrait 30.4
fashion 30.2
attractive 27.3
person 26.9
happy 24.4
face 24.2
pretty 22.4
man 22.2
brunette 21.8
together 21
smile 20.7
model 20.2
sexy 20.1
hair 19.8
cheerful 19.5
male 19.2
smiling 18.8
women 18.2
fun 17.2
two 16.9
couple 16.6
lady 16.2
studio 16
happiness 15.7
dress 15.4
youth 15.3
lifestyle 15.2
style 14.8
love 14.2
expression 13.7
friends 13.2
party 12.9
girls 12.8
make 12.7
group 12.1
sensual 11.8
child 11.5
sensuality 10.9
joy 10.9
dark 10.9
painter 10.6
looking 10.4
black 10.4
friendship 10.3
20s 10.1
indoor 10
stylish 9.9
romantic 9.8
clothing 9.8
costume 9.6
elegance 9.2
emotion 9.2
entertainment 9.2
teenager 9.1
blond 9.1
posing 8.9
body 8.8
education 8.7
work 8.6
cute 8.6
eyes 8.6
hairstyle 8.6
elegant 8.6
fashionable 8.5
shop 8.2
school 8.1
professional 7.7
hand 7.6
desk 7.6
passion 7.5
holding 7.4
glasses 7.4
classroom 7.4
teen 7.4
occupation 7.3
children 7.3
gorgeous 7.3
color 7.2
holiday 7.2
romance 7.1
family 7.1
kid 7.1
interior 7.1
look 7

Google
created on 2021-04-03

Microsoft
created on 2021-04-03

human face 98.2
person 97.8
text 97
sketch 95.2
drawing 95
clothing 89.9
woman 84.8
people 73
smile 72.9
white 68.1
old 50.1

Color Analysis

Face analysis

AWS Rekognition

Age 42-60
Gender Male, 99.8%
Calm 49.8%
Angry 12%
Surprised 9.9%
Happy 9.1%
Fear 8%
Sad 5.3%
Disgusted 3.9%
Confused 2%

AWS Rekognition

Age 29-45
Gender Female, 99.5%
Calm 91.1%
Sad 7%
Angry 1%
Fear 0.4%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 22-34
Gender Female, 91.7%
Happy 57.9%
Calm 11.5%
Surprised 9.4%
Confused 9.3%
Sad 6.2%
Fear 3.5%
Disgusted 1.3%
Angry 0.9%

AWS Rekognition

Age 6-16
Gender Male, 75%
Sad 95.1%
Fear 4.7%
Calm 0.1%
Angry 0%
Confused 0%
Happy 0%
Surprised 0%
Disgusted 0%
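
Each AWS Rekognition face block above lists eight emotion scores summing to roughly 100%. A minimal sketch (the dictionary below copies the second face's scores; the helper name is an assumption) of reducing such a block to its dominant emotion:

```python
# Scores from the second AWS Rekognition face listed above (percent).
face = {"Calm": 91.1, "Sad": 7.0, "Angry": 1.0, "Fear": 0.4,
        "Confused": 0.2, "Surprised": 0.2, "Disgusted": 0.1, "Happy": 0.1}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # ('Calm', 91.1)
```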

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
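
Unlike the services above, Google Vision reports face attributes as ordered likelihood strings rather than percentages. A hypothetical ordinal scale (an assumption for comparison purposes, not part of this record) makes the rows comparable:

```python
# Ordinal scale for Google Vision likelihood strings, as seen in the
# face-analysis rows above (0 = least likely).
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def likelihood_rank(label):
    """Map a likelihood string to its position on the scale."""
    return LIKELIHOOD.index(label)

# The fourth face's 'Blurred Very likely' is the strongest signal reported.
print(likelihood_rank("Very likely"))  # 4
```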

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

people portraits 58.3%
paintings art 41.2%

Text analysis

Amazon

TC12

Google

TC 12
TC
12