Human Generated Data

Title

Dana Jones

Date

1994

People

Artist: Nicholas Nixon, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2001.179

Copyright

© Nicholas Nixon

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Face 100
Human 100
Person 97.5
Head 96.1
Portrait 82.7
Photography 82.7
Photo 82.7
Smile 74.6
Text 61.6
Poster 58.8
Advertisement 58.8
Finger 58.1

Clarifai
created on 2023-10-25

portrait 99.9
child 99.8
monochrome 99.8
son 99.3
people 99.1
boy 98.2
girl 97.9
baby 97.2
school 96
one 93.6
smile 93.3
face 92.7
eye 92.6
art 90.7
studio 90.2
black and white 88.8
documentary 88.3
education 87.6
nonwhite 87.4
class 85.8

Imagga
created on 2022-01-09

person 38.2
laptop 32.5
adult 32.4
business 31.6
people 29
office 27.3
computer 24.9
portrait 24
child 23.9
attractive 23.1
working 22.1
happy 22
notebook 21.3
businesswoman 20.9
smile 20.7
face 20.6
work 20.4
pretty 20.3
portable computer 19.8
corporate 18.9
man 18.9
smiling 18.8
professional 18.6
sitting 18.1
male 17.6
businessman 16.8
one 16.4
black 15.7
suit 15.6
looking 15.2
manager 14.9
personal computer 14.8
desk 14.3
businesspeople 14.2
executive 14
holding 13.2
support 12.9
secretary 12.5
model 12.5
car 12
alone 11.9
worker 11.6
job 11.5
indoors 11.4
cheerful 11.4
career 11.4
call 11.3
eyes 11.2
hair 11.1
clothing 11.1
lifestyle 10.9
telephone 10.5
look 10.5
success 10.5
student 10.2
confident 10
color 10
digital computer 9.9
fashion 9.8
human 9.8
lady 9.7
close 9.7
technology 9.7
brunette 9.6
serious 9.5
women 9.5
tie 9.5
service 9.3
communication 9.2
inside 9.2
successful 9.2
indoor 9.1
blond 9.1
sexy 8.8
assistant 8.8
happiness 8.6
cute 8.6
expression 8.5
head 8.4
hand 8.4
necktie 8.3
20s 8.3
friendly 8.2
single 8.2
scholar 8
handsome 8
bow tie 8
home 8
book 7.8
juvenile 7.7
automobile 7.7
casual 7.6
customer 7.6
thinking 7.6
shirt 7.5
intellectual 7.4
phone 7.4
mirror 7.4
occupation 7.3
dress 7.2
car mirror 7.1
table 7
modern 7

Google
created on 2022-01-09

Forehead 98.5
Hair 98.4
Nose 98.4
Face 98.3
Cheek 97.9
Smile 96.1
Shirt 94.8
Eyebrow 94.6
Eye 93.7
Mouth 91.8
Black 89.6
Eyelash 89.4
Jaw 88
Flash photography 87.8
Ear 87.6
Happy 85.6
Iris 85.1
Black-and-white 83.9
Style 83.9
Adaptation 79.3

Microsoft
created on 2022-01-09

person 99.5
text 98.8
child 96
indoor 94.8
portrait 94.6
human face 91.4
face 89.6
boy 87.6
toddler 84.9
young 82.3
black and white 81.5
handwriting 77.7
eyes 71.4
smiling 70
baby 53.5
posing 52.1
staring 15.8

Color Analysis

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 99.4%
Calm 61.6%
Angry 19.2%
Confused 6.2%
Happy 3.3%
Surprised 3.2%
Sad 2.5%
Disgusted 2.1%
Fear 2%

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.5%

Categories

Imagga

paintings art 94.9%
people portraits 4.7%

Text analysis

Amazon

wrote
land
III.
land III. wa
wa
e
POW
se
POW HOT Haijo
HOT
Haijo