Human Generated Data

Title

Dana and Eddie, Soil Acidity Test

Date

1994

People

Artist: Nicholas Nixon, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2001.205

Copyright

© Nicholas Nixon

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 98.7
Person 95.1
Couch 68.2
Furniture 68.2
Monitor 64.4
Electronics 64.4
Display 64.4
Screen 64.4
Finger 62.8
Shelf 57.1
Hair 56.9
People 56.9
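
The Amazon tags above are the kind of labels returned by AWS Rekognition's DetectLabels operation, with confidence reported on a 0-100 scale. A minimal sketch using boto3 follows; the local file path and the confidence threshold are placeholder assumptions, not part of this record.

# Minimal sketch: image labels via AWS Rekognition's DetectLabels API (boto3).
# The file path and MinConfidence value are placeholders.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("dana_and_eddie.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # only return labels scored at 50% confidence or higher
)

# Print "label confidence" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')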

Clarifai
created on 2023-10-25

people 99.9
portrait 98.9
child 98.6
adult 98.5
monochrome 98.3
two 98.2
woman 97.7
group 97.5
man 97.4
three 97.4
boy 96.1
son 94.2
family 91.8
group together 90.9
offspring 89
four 88.7
indoors 87.2
education 86.3
room 86.2
music 85.6
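
The Clarifai concepts above come from Clarifai's general image-recognition model. A rough sketch against the Clarifai v2 predict REST endpoint follows; the API key, model ID, and image URL are placeholders, and the exact endpoint shape can differ by account and API version.

# Rough sketch: concept tags from Clarifai's v2 predict REST endpoint.
# API key, model ID, and image URL are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                      # placeholder credential
MODEL_ID = "general-image-recognition"                 # assumed general model ID
IMAGE_URL = "https://example.org/dana_and_eddie.jpg"   # hypothetical image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')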

Imagga
created on 2022-01-09

computer 45.1
office 42.4
man 37.8
working 37.1
person 36.1
desk 35.5
laptop 35.5
male 34.8
business 34.6
adult 34.6
work 33
people 32.9
sitting 30.9
professional 27.2
businessman 26.5
executive 26.1
smiling 26.1
corporate 25.8
happy 25.1
job 24.8
looking 24
indoors 22.9
home 21.5
businesspeople 20.9
businesswoman 20
bartender 20
technology 19.3
smile 19.3
table 18.7
notebook 17.8
attractive 17.5
women 17.4
indoor 16.4
team 16.1
casual 16.1
worker 16
one 15.7
lifestyle 15.2
handsome 15.2
mature 14.9
suit 14.9
education 14.7
men 14.6
color 14.5
communication 14.3
career 14.2
happiness 14.1
occupation 13.8
pretty 13.3
meeting 13.2
keyboard 13.1
teamwork 13
portrait 13
success 12.9
clothing 12.9
student 12.3
senior 12.2
successful 11.9
day 11.8
formal 11.5
paper 11
consultant 10.7
face 10.7
workplace 10.5
pen 10.4
manager 10.3
necktie 10.2
glasses 10.2
camera 10.2
scholar 10.2
alone 10.1
confident 10
television 9.9
modern 9.8
seated 9.8
colleagues 9.7
group 9.7
employment 9.7
partner 9.7
couple 9.6
wireless 9.5
two 9.3
room 9.3
document 9.3
horizontal 9.2
20s 9.2
employee 9.1
cheerful 8.9
partners 8.8
elderly 8.6
boss 8.6
bright 8.6
tie 8.5
adults 8.5
sit 8.5
learning 8.5
newspaper 8.4
monitor 8.4
hand 8.4
classroom 8.2
aged 8.1
intellectual 8.1
teacher 8
bow tie 7.9
coat 7.9
together 7.9
60s 7.8
discussion 7.8
1 7.7
class 7.7
partnership 7.7
old 7.7
talking 7.6
college 7.6
telecommunication system 7.6
contemporary 7.5
holding 7.4
single 7.4
black 7.2
blond 7.2
school 7.2
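
The Imagga tags above are typical output of Imagga's v2 tagging endpoint, which scores each tag on a 0-100 scale. A rough sketch with the requests library follows; the API key, secret, and image URL are placeholders.

# Rough sketch: tags from Imagga's v2 tagging REST endpoint.
# API key/secret and image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"                        # placeholder credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"                  # placeholder credential
IMAGE_URL = "https://example.org/dana_and_eddie.jpg"   # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
)
response.raise_for_status()

# Each tag carries an English label and a 0-100 confidence score.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')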

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 100
text 98
clothing 94.9
black and white 85.6
human face 83.7
man 69.6
monochrome 65.7
crowd 0.5
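
The Microsoft tags above are the kind of output produced by Azure Computer Vision's image-tagging feature. A rough sketch using the Python SDK (azure-cognitiveservices-vision-computervision) follows; the endpoint, key, and image URL are placeholders.

# Rough sketch: image tags via Azure Computer Vision's Python SDK.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder
IMAGE_URL = "https://example.org/dana_and_eddie.jpg"               # hypothetical URL

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns tags with 0-1 confidence values.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")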

Color Analysis

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 87.4%
Calm 88.8%
Sad 6.4%
Confused 1.6%
Surprised 0.9%
Angry 0.8%
Happy 0.6%
Disgusted 0.5%
Fear 0.5%

AWS Rekognition

Age 9-17
Gender Female, 99.8%
Sad 78.1%
Happy 6.5%
Disgusted 4.9%
Surprised 4.2%
Confused 2.3%
Angry 1.5%
Fear 1.3%
Calm 1.2%

AWS Rekognition

Age 28-38
Gender Male, 100%
Calm 93.7%
Confused 3.2%
Sad 1.2%
Fear 0.6%
Disgusted 0.5%
Angry 0.4%
Surprised 0.3%
Happy 0.2%
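
The age ranges, gender estimates, and emotion scores above are the per-face attributes returned by AWS Rekognition's DetectFaces operation. A minimal sketch using boto3 follows; the local file path is a placeholder.

# Minimal sketch: per-face age range, gender, and emotion estimates via
# AWS Rekognition's DetectFaces API (boto3). The file path is a placeholder.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("dana_and_eddie.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned with per-emotion confidence scores.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')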

Microsoft Cognitive Services

Age 12
Gender Female

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
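
The Google Vision entries above report face attributes as likelihood levels (Very unlikely through Very likely) rather than numeric scores. A rough sketch with the google-cloud-vision client follows, assuming a recent library version where likelihoods are returned as enums; the file path is a placeholder and credentials are taken from the environment.

# Rough sketch: face-attribute likelihoods via the Google Cloud Vision API
# (google-cloud-vision). The file path is a placeholder; credentials come from
# GOOGLE_APPLICATION_CREDENTIALS in the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("dana_and_eddie.jpg", "rb") as f:  # hypothetical local copy of the image
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihoods are enum values (VERY_UNLIKELY ... VERY_LIKELY), one face per annotation.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)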

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

people portraits 77.5%
paintings art 22.1%

Text analysis

Amazon

01
01 do
do
PALAG
I

Google

PAIG
PAIG
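
The text fragments above are raw OCR detections from the two services. A rough sketch of how such detections are obtained with AWS Rekognition's DetectText operation and the Google Cloud Vision text_detection method follows; the local file path is a placeholder.

# Rough sketch: text detection with AWS Rekognition (DetectText) and the
# Google Cloud Vision API (text_detection). The file path is a placeholder.
import boto3
from google.cloud import vision

with open("dana_and_eddie.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# AWS Rekognition: each detection is a LINE or WORD with the detected string.
rekognition = boto3.client("rekognition")
aws_response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in aws_response["TextDetections"]:
    print(detection["DetectedText"])

# Google Cloud Vision: text_annotations[0] is the full text block; the rest
# are the individual words, which is what appears in the list above.
vision_client = vision.ImageAnnotatorClient()
gcp_response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in gcp_response.text_annotations[1:]:
    print(annotation.description)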