Human Generated Data

Title

Tina and Lois

Date

1993

People

Artist: Vaughn Sills, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Beinecke Fund, 2.2002.1091

Copyright

© Vaughn Sills

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Clothing 89.9
Apparel 89.9
Furniture 75.8
Face 75
Undershirt 67.2
Indoors 66.9
Sitting 66.3
Room 65.1
Female 62.5
Advertisement 61.7
Screen 60
Electronics 60
Poster 58.5
Finger 57.8
Bedroom 57.2
Skin 57.1
Monitor 56.5
Display 56.5

Imagga
created on 2022-01-09

television 41.2
adult 33
person 29.9
people 29
telecommunication system 29
happy 25.1
portrait 24.6
man 24.2
billboard 22.6
attractive 22.4
pretty 21.7
male 20.6
sexy 20.1
computer 19.6
laptop 18.6
signboard 18.3
smile 17.8
smiling 16.6
love 16.6
business 16.4
face 16.3
looking 16
monitor 15.8
couple 15.7
sitting 15.5
office 15.4
lifestyle 15.2
one 14.9
model 14.8
women 14.2
desk 14.2
work 14.1
blond 14.1
human 13.5
hair 13.5
black 13.4
happiness 13.3
spectator 12.9
elevator 12.9
fashion 12.8
casual 12.7
working 12.4
structure 12.3
together 12.3
lady 12.2
fun 12
table 11.6
cute 11.5
communication 10.9
sensual 10.9
girlfriend 10.6
lifting device 10.3
senior 10.3
two 10.2
electronic equipment 10.1
dark 10
businesswoman 10
businessman 9.7
technology 9.6
body 9.6
passion 9.4
car 9.4
equipment 9.1
suit 9
home 8.8
brunette 8.7
child 8.7
education 8.7
loving 8.6
youth 8.5
professional 8.4
mother 8.4
leisure 8.3
device 8.2
indoor 8.2
girls 8.2
gorgeous 8.2
dress 8.1
family 8
posing 8
call 7.9
erotic 7.9
guy 7.9
husband 7.8
boy 7.8
hug 7.7
corporate 7.7
boyfriend 7.7
expression 7.7
skin 7.6
studio 7.6
wife 7.6
group 7.3
success 7.2
bartender 7.2
lovely 7.1
job 7.1
indoors 7
modern 7

Google
created on 2022-01-09

(no tags)

Microsoft
created on 2022-01-09

(no tags)
Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.3%
Calm 68%
Angry 10.9%
Disgusted 7%
Sad 5.7%
Confused 3.6%
Fear 2.1%
Surprised 1.8%
Happy 0.9%

AWS Rekognition

Age 18-24
Gender Female, 100%
Calm 98.9%
Sad 0.6%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Happy 0%

Microsoft Cognitive Services

Age 45
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a man sitting in front of a window 79.4%
a man and a woman sitting in front of a window 61.8%
a man and woman sitting next to a window 59.3%