Human Generated Data

Title

South Boston, Mass.

Date

1982

People

Artist: Sage Sohier, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.941

Copyright

© Sage Sohier

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.4
Person 99.4
Couch 99.2
Furniture 99.2
Person 93.3
Footwear 90.8
Shoe 90.8
Apparel 90.8
Clothing 90.8
Person 85.5
Room 83.9
Living Room 83.9
Indoors 83.9
Shoe 82.5
Person 71.7
Bed 69.2
Text 67.4
People 64.2
Sitting 62.6
Bedroom 58
Baby 57.1
Undershirt 56.1
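The Amazon list above reports some labels (e.g. "Person", "Shoe") once per detected instance, each with its own confidence. A minimal sketch of collapsing such a list to the highest confidence per label, using values excerpted from the tags above (the data structure and threshold handling are illustrative assumptions, not the museum's pipeline):

```python
# Excerpt of (label, confidence) pairs from the Amazon tag list above.
tags = [
    ("Person", 99.4), ("Couch", 99.2), ("Person", 93.3),
    ("Shoe", 90.8), ("Person", 85.5), ("Shoe", 82.5), ("Person", 71.7),
]

# Keep the highest confidence seen for each label.
best = {}
for name, conf in tags:
    best[name] = max(best.get(name, 0.0), conf)

# Print labels sorted by descending confidence.
for name, conf in sorted(best.items(), key=lambda kv: -kv[1]):
    print(f"{name} {conf}")
```

Repeated labels collapse to one entry each, so the excerpt above reduces to Person, Couch, and Shoe at their top scores.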

Imagga
created on 2022-01-09

person 47.9
adult 35.2
home 34.3
people 31.8
man 30.9
indoors 30.8
patient 29.2
male 26.5
happy 25.1
sitting 22.3
room 22.1
case 21.4
sick person 20.8
happiness 19.6
computer 19.3
smiling 18.8
house 18.4
couch 18.4
lifestyle 18.1
office 17.8
business 17.6
working 16.8
couple 16.6
men 16.3
work 16
businessman 15.9
laptop 15.7
smile 15.7
indoor 15.5
together 14.9
businesswoman 14.5
casual 14.4
professional 13.7
women 13.4
newspaper 13.3
adults 13.3
executive 12.9
20s 12.8
team 12.5
family 12.5
interior 12.4
chair 12.4
portrait 12.3
clothing 12.2
cheerful 12.2
child 12.1
group 12.1
corporate 12
life 11.8
worker 11.6
job 11.5
attractive 11.2
alone 11
sofa 10.5
businesspeople 10.4
females 10.4
senior 10.3
product 10.1
relaxation 10.1
holding 9.9
30s 9.6
looking 9.6
desk 9.6
reading 9.5
talking 9.5
living 9.5
meeting 9.4
manager 9.3
modern 9.1
classroom 8.9
creation 8.8
living room 8.8
mother 8.6
comfortable 8.6
break 8.6
resting 8.6
hospital 8.5
occupation 8.3
technology 8.2
student 7.9
nurse 7.8
loving 7.6
bed 7.6
one person 7.5
enjoyment 7.5
clothes 7.5
mature 7.4
teamwork 7.4
window 7.3
lady 7.3
cushion 7.3
confident 7.3
color 7.2

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.2
clothing 97
person 96.7
indoor 95.4
drawing 95.3
wall 95.3
woman 86.9
sketch 83.7
black and white 78.2
bed 77.1
man 72.8
clothes 25.8

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 100%
Sad 72.4%
Calm 20.4%
Confused 4.7%
Surprised 0.8%
Angry 0.6%
Fear 0.6%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 48-54
Gender Male, 94.6%
Calm 73.6%
Surprised 20.3%
Confused 1.4%
Happy 1.3%
Sad 1%
Disgusted 0.9%
Angry 0.8%
Fear 0.7%

AWS Rekognition

Age 0-3
Gender Male, 99.7%
Calm 97.2%
Sad 1.5%
Happy 0.6%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%
Surprised 0.1%

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 90.8%
Bed 69.2%

Captions

Microsoft

a man and a woman sitting on a bed 60.3%
a group of people sitting on a bed 60.2%
a person sitting on a bed 60.1%

Text analysis

Amazon

Cards
rebound,
Sports
Cards rebound, 5-4
-
5-4
Peter's

Google

Sports
Cards
rebound,
Sports Cards rebound, 5-4
5-4