Human Generated Data

Title

Mary, Esther, and Ellis, members of the Organization Black Belt Citizens Fighting for Health and Justice, Uniontown, Alabama

Date

2014

People

Artist: Jeff Rich, American, born 1977

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2020.155

Copyright

© Jeff Rich

Machine Generated Data

Tags

Amazon
created on 2022-05-03

Human 99.5
Person 99.5
Person 98.9
Couch 98.3
Furniture 98.3
Sitting 98
Person 94.7
Living Room 93.7
Indoors 93.7
Room 93.7
Chair 92.8
Restaurant 92.2
Plant 82.8
Food 82.6
Flooring 74.9
Interior Design 74.1
Food Court 72.4
Flower 72
Blossom 72
Wood 70.5
Meal 68.1
Senior Citizen 65.6
Chair 65.4
Cafe 62.3
Display 59.4
Screen 59.4
Monitor 59.4
Electronics 59.4
Cafeteria 58.4
Flower Arrangement 56.9
Shelf 55.4
Cabinet 55.3
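
Each tag above pairs a label with a confidence score, and the same label can appear more than once (e.g. Person, Chair) when multiple instances are detected. A common post-processing step is to keep the highest-confidence occurrence of each label; here is a minimal sketch in Python (the sample data is copied from the Amazon list above; the helper name is hypothetical):

```python
def dedupe_labels(labels):
    """Keep the highest confidence seen for each label name."""
    best = {}
    for name, conf in labels:
        if conf > best.get(name, 0.0):
            best[name] = conf
    # Return labels sorted by descending confidence.
    return sorted(best.items(), key=lambda kv: -kv[1])

# A few entries from the Amazon tag list above.
amazon_tags = [
    ("Person", 99.5), ("Person", 98.9), ("Person", 94.7),
    ("Chair", 92.8), ("Chair", 65.4), ("Couch", 98.3),
]

print(dedupe_labels(amazon_tags))
# [('Person', 99.5), ('Couch', 98.3), ('Chair', 92.8)]
```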

Imagga
created on 2022-05-03

man 37
room 33.6
people 30.7
teacher 30.4
director 29.7
table 28.9
person 28.3
male 27.7
sitting 27.5
meeting 27.3
couple 27
group 25
smiling 23.9
indoors 23.7
adult 23.7
together 21.9
business 21.9
home 21.6
team 21.5
businessman 21.2
office 20.7
classroom 20
lifestyle 19.5
professional 19.4
happy 18.8
restaurant 18.2
women 18.2
work 18.1
men 18
educator 18
businesswoman 17.3
interior 16.8
laptop 16.5
suit 16.3
teamwork 15.8
chair 15.6
desk 15.1
corporate 14.6
computer 14.5
communication 14.3
smile 14.3
adults 14.2
enjoyment 14.1
education 13.9
cheerful 13.8
indoor 13.7
conference 13.7
executive 13.7
full length 13.6
happiness 13.3
modern 13.3
talking 13.3
togetherness 13.2
presentation 13
occupation 12.8
student 12.6
holding 12.4
color 12.2
friends 12.2
food 11.8
portrait 11.7
family 11.6
friendship 11.2
dinner 11.2
children 10.9
leisure activity 10.8
two people 10.7
entrepreneur 10.7
working 10.6
females 10.4
technology 10.4
child 10.3
love 10.3
20-24 years 9.8
meal 9.8
colleagues 9.7
class 9.6
sofa 9.6
lunch 9.6
daughter 9.6
kin 9.5
living 9.5
board 9.4
relationship 9.4
drink 9.2
worker 9.2
20s 9.2
girls 9.1
attractive 9.1
job 8.9
casual clothing 8.8
30s 8.7
workplace 8.6
businesspeople 8.5
eating 8.4
manager 8.4
relaxation 8.4
coffee 8.3
service 8.3
inside 8.3
life 8.2
lady 8.1
center 8.1
romantic 8
looking 8
elementary age 7.9
25-29 years 7.9
day 7.9
40s 7.8
discussion 7.8
glass 7.8
dad 7.8
emotions 7.8
content 7.7
couch 7.7
luxury 7.7
diversity 7.7
dining 7.6
contemporary 7.5
senior 7.5
father 7.5
mature 7.4
wine 7.4
emotion 7.4
success 7.2
hall 7.2

Google
created on 2022-05-03

Furniture 94.4
Blue 91.9
Green 91.2
Window 91.2
Picture frame 88.7
Lighting 87.2
Interior design 84.7
Table 84.5
Chair 83.2
Leisure 76.9
Event 73.5
Flooring 72.9
Curtain 70.8
Room 67.1
Living room 66.6
Conversation 65
Sitting 63.7
Luggage and bags 63.4
T-shirt 62.2
Fun 61.8

Microsoft
created on 2022-05-03

sitting 99.2
person 96.8
wall 96.2
clothing 90.1
table 85.1
furniture 82.4
woman 66.7
chair 61.2
dining table 17.2
dining room 7.5

Face analysis

AWS Rekognition

Age 54-62
Gender Female, 50.5%
Calm 45.9%
Fear 14.4%
Surprised 12.1%
Happy 10.7%
Sad 10.1%
Disgusted 4.4%
Angry 3%
Confused 1.4%

AWS Rekognition

Age 48-54
Gender Female, 100%
Calm 69.1%
Sad 48.8%
Surprised 6.5%
Fear 6.1%
Disgusted 1.2%
Angry 1.1%
Happy 0.8%
Confused 0.3%

AWS Rekognition

Age 56-64
Gender Female, 97.8%
Calm 99.2%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Confused 0.2%
Angry 0%
Disgusted 0%
Happy 0%
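
Each AWS Rekognition face block above reports a percentage per emotion; the dominant emotion is simply the one with the highest score. A minimal sketch (the scores are copied from the first face block above; the function name is hypothetical):

```python
def dominant_emotion(scores):
    """Return the (emotion, percent) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

# Emotion scores for the first face detected above.
face_1 = {
    "Calm": 45.9, "Fear": 14.4, "Surprised": 12.1, "Happy": 10.7,
    "Sad": 10.1, "Disgusted": 4.4, "Angry": 3.0, "Confused": 1.4,
}

print(dominant_emotion(face_1))
# ('Calm', 45.9)
```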

Microsoft Cognitive Services

Age 42
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
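
Unlike Rekognition's percentages, Google Vision reports bucketed likelihoods. To compare attributes across faces, the buckets can be mapped to ordinal values; a sketch under that assumption (the ordering follows Google Vision's likelihood scale; the helper name is hypothetical):

```python
# Google Vision's likelihood buckets, from least to most likely.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def likelihood_rank(bucket):
    """Map a likelihood bucket name to an ordinal value (0 = least likely)."""
    return LIKELIHOOD_ORDER.index(bucket)

# Joy ratings for the three faces above.
joy = ["Likely", "Very unlikely", "Unlikely"]
print(max(joy, key=likelihood_rank))
# Likely
```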

Feature analysis

Amazon

Person 99.5%
Chair 92.8%

Captions

Microsoft

a man and a woman sitting on a bench 78.1%
a person sitting on a bench 78%
a man and woman sitting on a bench 74.3%

Text analysis

Google

OLDER than DIRT
OLDER
than
DIRT