Human Generated Data

Title

Elizabeth Donovan, Phillis Emsig, N. Cambridge

Date

1994

People

Artist: Nicholas Nixon, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2001.217

Copyright

© Nicholas Nixon

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Finger 93.4
Sitting 92.6
Person 85.7
Bird 84.2
Animal 84.2
Face 69.3
Glasses 63.5
Accessories 63.5
Accessory 63.5
Window 59
Shelf 58.9
LCD Screen 55.4
Electronics 55.4
Screen 55.4
Monitor 55.4
Display 55.4
Cafe 55.3
Restaurant 55.3

Clarifai
created on 2023-10-26

monochrome 99.6
people 99.6
elderly 99.2
portrait 98.8
street 98.1
man 97.6
adult 97
old 96.6
elder 96.3
black and white 95.5
two 89.9
art 89.9
one 89.6
documentary 86.8
chair 84.6
music 83.7
walking stick 83.2
leader 81.3
woman 75.7
scientist 75

Imagga
created on 2022-01-09

computer 39.7
man 37.6
senior 33.7
people 33.5
person 31.6
home 28.7
male 28.4
indoors 28.1
smiling 26.8
laptop 26.1
sitting 25.8
adult 25.8
seller 25.1
office 24.9
happy 23.2
mature 22.3
retired 22.3
grandma 21.2
working 20.3
old 20.2
retirement 20.2
elderly 20.1
work 19.6
keyboard 18.2
business 17.6
couple 17.4
grandfather 17.4
desk 17
portrait 16.2
together 15.8
lifestyle 15.2
smile 15
technology 14.1
musical instrument 13.1
worker 13
face 12.8
glasses 12
professional 12
looking 12
pensioner 12
women 11.9
room 11.9
communication 11.8
happiness 11.8
monitor 11.7
typing 11.7
family 11.6
businessman 11.5
camera 11.1
businesswoman 10.9
teaching 10.7
husband 10.5
wife 10.4
education 10.4
table 10.4
one person 10.4
men 10.3
notebook 10.2
casual 10.2
active 9.9
cheerful 9.8
lady 9.7
two people 9.7
one 9.7
reading 9.5
corporate 9.5
day 9.4
learning 9.4
executive 9.3
inside 9.2
pretty 9.1
suit 9
job 8.8
scholar 8.8
specialist 8.8
older 8.7
boy 8.7
businesspeople 8.5
enjoying 8.5
attractive 8.4
indoor 8.2
piano 8.2
relaxing 8.2
intellectual 8.2
handsome 8
gray hair 7.9
grandmother 7.8
discussion 7.8
chair 7.7
child 7.7
age 7.6
talking 7.6
stringed instrument 7.6
meeting 7.5
mother 7.5
relaxed 7.5
student 7.3
occupation 7.3
gray 7.2
hair 7.1
look 7

Google
created on 2022-01-09

Black-and-white 85.4
Gesture 85.3
Style 83.9
People 77.7
Monochrome 73.8
Monochrome photography 72.5
Sitting 68.4
Glass 65
Eyewear 64.8
Stock photography 63.5
History 62.3
Art 60.2
Window 58.3
Conversation 56.5
Room 52.8
Wrinkle 51.9
Vintage clothing 50.4
Door 50.3

Microsoft
created on 2022-01-09

person 99.9
black and white 96.5
text 96.4
man 91
monochrome 89.3
human face 81.9
glasses 76.9
street 73.3
clothing 71.5
old 44.4

Face analysis

AWS Rekognition

Age 56-64
Gender Male, 99.5%
Calm 95.9%
Sad 1.8%
Surprised 0.5%
Disgusted 0.5%
Happy 0.4%
Confused 0.4%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 41-49
Gender Male, 99.8%
Calm 98.3%
Confused 0.7%
Surprised 0.3%
Angry 0.3%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%
Sad 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Bird 84.2%
Glasses 63.5%

Categories

Imagga

paintings art 55.5%
pets animals 27.1%
people portraits 15.6%
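The machine-generated tags above are flat "label score" lines, where a label may itself contain spaces (e.g. "LCD Screen 55.4"). A minimal Python sketch for parsing such a block into structured (label, confidence) pairs, assuming the exact line format shown in this record:

```python
import re


def parse_tags(block: str) -> list[tuple[str, float]]:
    """Parse 'Label score' lines into (label, confidence) pairs.

    The label is everything before the trailing number, so
    multi-word labels like 'LCD Screen' are kept intact.
    """
    tags = []
    for line in block.strip().splitlines():
        m = re.match(r"^(.*\S)\s+(\d+(?:\.\d+)?)$", line)
        if m:
            tags.append((m.group(1), float(m.group(2))))
    return tags


# A few lines taken from the Amazon tag list in this record.
amazon_tags = """\
Person 99.4
Human 99.4
Finger 93.4
Window 59
LCD Screen 55.4
"""

parsed = parse_tags(amazon_tags)
# parsed[0] is ("Person", 99.4); "Window 59" parses as ("Window", 59.0)
```

Note that scores without a decimal point (such as "Window 59") still parse, coming back as floats.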