Human Generated Data

Title

Marc and Genita, Lunch

Date

1994

People

Artist: Nicholas Nixon, American (born 1947)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2001.178

Copyright

© Nicholas Nixon

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
LCD Screen 88.7
Electronics 88.7
Screen 88.7
Monitor 88.7
Display 88.7
Pc 79.3
Computer 79.3
Finger 78.6
Person 69.3
Keyboard 67.7
Furniture 63.6
Computer Keyboard 57.4
Hardware 57.4
Computer Hardware 57.4
Shelf 57.2
Clothing 57.1
Apparel 57.1
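
For reference, label/confidence pairs like those above can be produced with AWS Rekognition's DetectLabels API. The sketch below is illustrative only; the local file name and the thresholds are assumptions, not part of this record.

# Minimal boto3 sketch for generating Rekognition label tags (assumes AWS
# credentials are configured; the file name and thresholds are hypothetical).
import boto3

rekognition = boto3.client("rekognition")

with open("marc_and_genita_lunch.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels
    MinConfidence=55,    # roughly the lowest confidence shown in the list above
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')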

Clarifai
created on 2023-10-26

people 99.7
portrait 99.5
child 98.1
adult 97.1
boy 97
monochrome 97
one 96.9
man 96.6
son 95.7
street 94.1
room 93.2
telephone 91.6
music 91
two 90.4
sit 87.1
baby 86.4
facial expression 86.2
girl 86.2
family 85.9
furniture 83.8
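
Concept scores like the Clarifai list above can be requested from Clarifai's v2 predict endpoint. The sketch below is an assumption-laden illustration: the model ID, credential, image URL, and response shape are placeholders, not taken from this record.

# Hypothetical Clarifai v2 predict call; the model ID, credential, URL, and the
# response path used below are assumptions.
import requests

CLARIFAI_KEY = "your_api_key"  # placeholder credential

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",  # assumed model ID
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/marc_and_genita_lunch.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))  # scores come back on a 0-1 scale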

Imagga
created on 2022-01-09

man 39
person 34.3
male 29.2
computer 27
adult 24.6
laptop 24.5
people 21.8
sitting 21.5
work 21.2
guy 19.9
hand 19.2
adolescent 18.8
office 18.6
indoors 18.5
business 18.2
home 16.8
one 16.4
black 16.3
juvenile 16
job 15
working 15
happy 14.4
handsome 14.3
portrait 14.2
notebook 13.9
smiling 13.7
lifestyle 13.7
technology 13.4
businessman 13.2
corporate 12.9
keyboard 12.7
hands 12.2
men 12
looking 12
desk 11.4
professional 11.1
relaxing 10.9
face 10.7
attractive 10.5
room 10.2
executive 10.1
indoor 10
sexy 9.6
sofa 9.6
color 9.5
model 9.3
portable computer 9.3
worker 9.1
human 9
employee 9
interior 8.8
couple 8.7
love 8.7
wireless 8.6
reading 8.6
smile 8.6
one person 8.5
camera 8.3
student 8.3
holding 8.3
equipment 7.8
happiness 7.8
typing 7.8
couch 7.7
personal computer 7.7
using 7.7
finance 7.6
communication 7.6
relaxed 7.5
leisure 7.5
mature 7.4
focus 7.4
suit 7.3
alone 7.3
table 7.3
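
Tag/confidence pairs like the Imagga list above can be fetched from Imagga's v2 tagging endpoint over REST. The sketch below uses placeholder credentials and a hypothetical image URL; the response parsing reflects an assumed JSON layout.

# Hypothetical Imagga v2 tagging request; credentials, image URL, and response
# parsing are assumptions.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credentials
IMAGGA_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/marc_and_genita_lunch.jpg"},  # hypothetical URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # Imagga uses HTTP Basic auth
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))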

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.9
text 98.7
human face 98.4
black and white 93.5
clothing 93.4
indoor 90.2
monochrome 69.3
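
Tags of this kind can be requested from the Azure Computer Vision analyze endpoint. The sketch below is illustrative: the endpoint host, key, API version, and image URL are placeholders, and Azure reports confidence on a 0-1 scale, so the percentages above imply a conversion step.

# Hypothetical Azure Computer Vision analyze request; endpoint, key, URL, and
# API version are assumptions.
import requests

AZURE_ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "your_subscription_key"                                   # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": AZURE_KEY,
        "Content-Type": "application/json",
    },
    json={"url": "https://example.org/marc_and_genita_lunch.jpg"},  # hypothetical image URL
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))  # convert 0-1 scores to percentages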

Color Analysis

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 100%
Sad 70.6%
Calm 23.6%
Fear 2.7%
Angry 1%
Disgusted 0.6%
Surprised 0.6%
Confused 0.5%
Happy 0.4%
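
Age range, gender, and emotion scores like those above can be obtained from AWS Rekognition's DetectFaces API with all facial attributes requested. The sketch below is illustrative; the file name is an assumption.

# Minimal boto3 sketch for Rekognition face analysis (the file name is hypothetical).
import boto3

rekognition = boto3.client("rekognition")

with open("marc_and_genita_lunch.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')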

Microsoft Cognitive Services

Age 7
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
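
Likelihood ratings such as "Very unlikely" correspond to the Likelihood enum returned by Google Cloud Vision face detection. The sketch below is illustrative; the file name is an assumption, and application credentials are assumed to be configured.

# Minimal google-cloud-vision sketch for face detection likelihoods.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("marc_and_genita_lunch.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Enum names print as e.g. VERY_UNLIKELY rather than "Very unlikely".
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)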

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

people portraits 84.3%
pets animals 13.7%
paintings art 1.5%

Text analysis

Amazon

COMP
AM
VAU

Google

MAII AN
MAII
AN
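
The strings listed under "Text analysis" resemble raw OCR output. The sketch below shows how such detections can be produced with AWS Rekognition's DetectText API; the file name is an assumption, and Google's equivalent would use the Cloud Vision text_detection method.

# Minimal boto3 sketch for Rekognition text detection (the file name is hypothetical).
import boto3

rekognition = boto3.client("rekognition")

with open("marc_and_genita_lunch.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both LINE and WORD detections; print the line-level text only.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])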