Human Generated Data

Title

Eddie, Tammy, Sam, Genita and Lucien, Science

Date

1994

People

Artist: Nicholas Nixon, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2001.191

Copyright

© Nicholas Nixon

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 99.4
Finger 96.1
Person 84.5
Person 75.4
Text 60.9

Clarifai
created on 2023-10-25

people 99.2
monochrome 97.9
woman 96.3
man 94.3
adult 94
child 93.6
desk 91.1
education 90.3
two 89.6
boy 88.4
hand 88
group 87.3
composition 87.2
school 87
concentration 86.2
table 85
book series 82.6
writing 82.3
portrait 81.7
girl 81.7

Imagga
created on 2022-01-09

person 43.6
working 35.4
hand 34.2
business 34.1
work 33
office 31.6
pen 30.8
paper 30.6
people 30.2
man 29.2
computer 25.9
male 24.9
scholar 24.7
laptop 23.7
job 23
intellectual 22.7
document 22.3
student 21.8
businessman 21.2
hands 20.9
technology 20.8
writing 20.7
desk 19.9
occupation 19.3
adult 19.1
education 19.1
finger 18.5
table 18.2
professional 16.8
executive 16.6
corporate 16.3
sitting 16.3
keyboard 16.3
home 16
human 15.8
typing 14.6
busy 14.5
worker 14.4
workplace 14.3
close 14.3
closeup 13.5
communication 13.5
meeting 13.2
learning 13.2
looking 12
contract 11.7
paperwork 11.7
lifestyle 11.6
holding 11.6
fingers 11.4
reading 11.4
career 11.4
write 11.3
success 11.3
men 11.2
notebook 11
finance 11
conference 10.8
happy 10.7
indoors 10.6
text 10.5
manager 10.3
suit 10.1
employee 9.8
one 9.7
report 9.7
book 9.6
course 9.6
pencil 9.5
women 9.5
planner 9.4
teamwork 9.3
child 9.2
data 9.1
newspaper 9
signing 8.9
information 8.9
stretcher 8.8
equipment 8.8
agreement 8.7
planning 8.7
using 8.7
studying 8.6
notes 8.6
smile 8.6
casual 8.5
note 8.3
indoor 8.2
businesswoman 8.2
school 8.1
detail 8.1
financial 8
room 8
enrollee 7.9
button 7.9
signature 7.9
black 7.8
concentration 7.7
type 7.7
personal 7.7
plan 7.6
device 7.5
key 7.5
study 7.5
focus 7.4
letter 7.3
successful 7.3
team 7.2
handsome 7.1
day 7.1
boy 7.1
modern 7
litter 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.3
text 97.9
sitting 97
indoor 93
black and white 90.4
drawing 86.8
handwriting 86.8
watch 78.7
people 58.1
clothing 57.9
human face 55.9

Color Analysis

Face analysis

AWS Rekognition

Age 7-17
Gender Male, 57.2%
Calm 98.4%
Happy 0.5%
Surprised 0.3%
Fear 0.2%
Confused 0.1%
Sad 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 19-27
Gender Male, 59.1%
Calm 77.8%
Sad 17.8%
Disgusted 1.2%
Angry 1.2%
Fear 1.1%
Happy 0.6%
Surprised 0.2%
Confused 0.2%

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 50.5%
people portraits 38.8%
food drinks 8.4%

Text analysis

Amazon

the
was
first
it
ater
ry's
from the
not
that
is
that first it was turning
school.
water
couse was not
ry's water waster
A
ater the way it look
e from the school.
Y was is
wife
turning
was wife agin
spring water are
spring
look
waster
X, A
wete
way it
are
couse
®
A wete
agin
e
X,
Y
COSTE

Google

১ % Pring uiater. ter tht wiyit fek के from tene Schoel. nS wetcr Was fe ouse was noty
%
Pring
uiater.
ter
tht
wiyit
fek
के
from
tene
Schoel.
nS
wetcr
Was
fe
ouse
was
noty