Human Generated Data

Title

Laurence and Gina D'Agostino, Victoria Stone, N. Cambridge

Date

1994

People

Artist: Nicholas Nixon, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2001.220

Copyright

© Nicholas Nixon

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.6
Person 99.5
Tablecloth 98.3
Restaurant 98
Sitting 91.1
Food Court 87.6
Food 87.6
Meal 75.6
Dish 75.3
Dining Table 73.5
Furniture 73.5
Table 73.5
Senior Citizen 55.3
Eating 55.2

Clarifai
created on 2023-10-25

people 99.4
room 98.2
table 97.8
furniture 97.2
man 96.8
family 96.1
elderly 95.3
monochrome 94.3
indoors 94
group 93.3
adult 93.1
couple 91.9
group together 90.6
two 90.4
four 89
portrait 88.9
dining room 88.9
sit 88.4
woman 88.3
elder 88.3

Imagga
created on 2022-01-09

man 43.1
person 39
male 37.9
people 34.6
home 28.7
adult 27.8
couple 27
patient 25.6
indoors 25.5
meeting 25.5
office 21.2
together 21
smiling 19.5
business 17
happy 16.9
case 16.9
laptop 16.6
sitting 16.3
sick person 16.3
desk 16.2
talking 16.2
businessman 15.9
family 15.1
table 15
businesswoman 14.6
team 14.3
women 14.2
businesspeople 14.2
senior 14.1
professional 14
20s 13.8
group 13.7
room 13.7
computer 13.7
40s 13.6
30s 13.5
child 13
men 12.9
mother 12.1
smile 12.1
corporate 12
two 11.9
two people 11.7
holding 11.6
nurse 11.5
adults 11.4
mature 11.2
portrait 11
work 11
relaxing 10.9
lifestyle 10.8
colleagues 10.7
working 10.6
cheerful 10.6
teamwork 10.2
food 10.1
drink 10
cup 10
husband 9.9
job 9.7
dad 9.5
worker 9.5
day 9.4
happiness 9.4
friends 9.4
teacher 9.2
children 9.1
health 9
hospital 8.9
to 8.9
discussing 8.8
father 8.8
retired 8.7
mid adult 8.7
elderly 8.6
executive 8.6
workplace 8.6
wife 8.5
attractive 8.4
coffee 8.3
emotion 8.3
care 8.2
daughter 8.2
beverage 8.1
suit 8.1
kitchen 8.1
elementary age 7.9
boardroom 7.9
love 7.9
grandfather 7.9
glass 7.9
beverages 7.9
discussion 7.8
color 7.8
middle aged 7.8
couch 7.7
daytime 7.7
modern 7.7
meal 7.7
classroom 7.7
reading 7.6
house 7.5
occupation 7.3
successful 7.3
indoor 7.3
dinner 7.1
medical 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98.9
indoor 92.5
human face 85.2
clothing 83.3
table 82.1
food 79.4
man 78.1
black and white 76.1
text 65.9
tableware 51.2

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 100%
Calm 40.1%
Angry 15.7%
Surprised 10.3%
Disgusted 10%
Sad 8.7%
Fear 7.4%
Happy 5.8%
Confused 2.1%

AWS Rekognition

Age 53-61
Gender Female, 99.9%
Calm 39.9%
Sad 30.4%
Happy 25.6%
Angry 1.3%
Confused 1%
Disgusted 1%
Surprised 0.4%
Fear 0.4%

AWS Rekognition

Age 48-56
Gender Male, 95.9%
Calm 85.4%
Sad 8.1%
Fear 1.6%
Happy 1.5%
Surprised 1.2%
Confused 0.9%
Disgusted 0.6%
Angry 0.6%

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 67
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories

Text analysis

Amazon

COOKIES
ER COOKIES
ER