Human Generated Data

Title

Judy and Ed O'Meara

Date

1988

People

Artist: Nancy Floyd, American (born 1956)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1793

Machine Generated Data

Tags (confidence %)

Amazon
created on 2021-12-14

Human 98.4
Person 98.4
Person 97.9
Clothing 93.4
Apparel 93.4
Face 83
Furniture 78.1
Couch 73.1
Shorts 71.7
Advertisement 71
Person 68.1
Skin 66.7
Poster 65.1
Collage 64.2
Finger 63.9
Man 63.6
Female 63.1
Photography 62.9
Photo 62.9
Tire 59.9
Sitting 59.4
Screen 58.7
Electronics 58.7
Monitor 58.7
Display 58.7
LCD Screen 58.7
Outdoors 56.9
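
The Amazon scores above are confidence values from the Rekognition DetectLabels operation. A minimal boto3 sketch of how such tags could be regenerated is shown below; the file name and the MinConfidence threshold are assumptions, not part of the museum record.

import boto3

# Sketch only: "omeara_1988.jpg" stands in for a local copy of the photograph.
client = boto3.client("rekognition")
with open("omeara_1988.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence=55 is an assumed cutoff, roughly matching the lowest score listed above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')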

Imagga
created on 2021-12-14

adult 34.3
man 31.6
people 29.6
person 28.2
male 28
couple 22.7
hairdresser 22.3
portrait 22
attractive 18.2
face 17.8
business 17.6
pretty 17.5
happy 16.9
professional 16.1
looking 15.2
women 15
human 15
smiling 14.5
two 14.4
office 13.7
love 13.4
work 13.3
happiness 13.3
executive 13.3
holding 13.2
child 13.1
lifestyle 13
sitting 12.9
hair 12.7
shop 12.6
brunette 12.2
cheerful 12.2
girls 11.8
indoors 11.4
smile 11.4
adolescent 11.3
expression 11.1
black 11
job 10.6
businessman 10.6
suit 10.5
boy 10.4
casual 10.2
guy 9.9
handsome 9.8
family 9.8
together 9.6
hands 9.6
corporate 9.5
youth 9.4
juvenile 9.3
model 9.3
20s 9.2
businesswoman 9.1
fun 9
lady 8.9
romantic 8.9
sexy 8.8
cute 8.6
men 8.6
businesspeople 8.5
females 8.5
binoculars 8.5
mature 8.4
teenager 8.2
style 8.2
romance 8
bride 8
painter 8
look 7.9
barbershop 7.8
depression 7.8
bow tie 7.7
hand 7.6
pair 7.6
togetherness 7.6
fashion 7.5
one person 7.5
emotion 7.4
indoor 7.3
confident 7.3
sensuality 7.3
worker 7.1
groom 7
parent 7
modern 7
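
The Imagga tags above can be produced with Imagga's REST tagging endpoint. The sketch below assumes the requests library; the credentials and file name are placeholders.

import requests

# Sketch only: IMAGGA_KEY / IMAGGA_SECRET and the file name are placeholders.
with open("omeara_1988.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=("IMAGGA_KEY", "IMAGGA_SECRET"),
        files={"image": f},
    )
resp.raise_for_status()
for item in resp.json()["result"]["tags"]:
    # Each entry pairs an English tag with a confidence score in percent.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')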

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 99.7
text 99.2
clothing 92.7
human face 90.6
man 90.1
black and white 68.7
people 63.7
concert 61.1
poster 52.7
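
The Microsoft tags above correspond to the Azure Computer Vision tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: substitute a real endpoint, subscription key, and image file.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)
with open("omeara_1988.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)
for tag in result.tags:
    # Confidence is returned on a 0-1 scale; scale to percent to match the list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")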

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 99.2%
Calm 96.4%
Happy 1.5%
Sad 1.1%
Angry 0.5%
Confused 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 36-54
Gender Male, 53.8%
Calm 78.1%
Sad 15.1%
Angry 5.3%
Confused 1%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%
Surprised 0.1%
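
Both Rekognition face blocks above (age range, gender estimate, emotion scores) have the shape of a single DetectFaces call, which returns one entry per detected face. A boto3 sketch, with a hypothetical file name, is given below.

import boto3

# Sketch only: "omeara_1988.jpg" stands in for a local copy of the photograph.
client = boto3.client("rekognition")
with open("omeara_1988.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are reported as a list of type/confidence pairs, strongest first here.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')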

Microsoft Cognitive Services

Age 41
Gender Male
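
The Microsoft age and gender estimates above are the kind of output the Azure Face API's detect operation returned at the time these tags were generated (Microsoft has since restricted these attributes). The sketch below uses the azure-cognitiveservices-vision-face SDK with placeholder endpoint, key, and file name.

from azure.cognitiveservices.vision.face import FaceClient
from azure.cognitiveservices.vision.face.models import FaceAttributeType
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: substitute a real endpoint, subscription key, and image file.
face_client = FaceClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)
with open("omeara_1988.jpg", "rb") as f:
    faces = face_client.face.detect_with_stream(
        f,
        return_face_id=False,
        return_face_attributes=[FaceAttributeType.age, FaceAttributeType.gender],
    )
for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    # Gender may deserialize as an enum or a plain string; normalize either way.
    print(f"Gender {str(attrs.gender).split('.')[-1].capitalize()}")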

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
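
The Google Vision likelihood ratings above are the kind of values returned by the Vision API's face detection. A minimal sketch with the google-cloud-vision client (assuming version 2.x and a hypothetical file name) is shown below.

from google.cloud import vision

# Sketch only: credentials are taken from the environment; the file name is a placeholder.
client = vision.ImageAnnotatorClient()
with open("omeara_1988.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)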

Feature analysis

Amazon

Person 98.4%

Captions

Microsoft

a couple of people that are talking to each other 76.3%
a group of people looking at a phone 66.6%
a group of people looking at a cell phone 59.9%
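
The Microsoft captions above, each with a confidence score, are the kind of output returned by Azure Computer Vision's image description operation. The sketch below reuses the same SDK with placeholder endpoint, key, and file name; max_candidates=3 is assumed to match the three captions listed.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: substitute a real endpoint, subscription key, and image file.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)
with open("omeara_1988.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)
for caption in description.captions:
    # Confidence is on a 0-1 scale; scale to percent to match the captions above.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")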