Human Generated Data

Title

South Boston, Massachusetts

Date

1982

People

Artist: Sage Sohier, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1104

Copyright

© Sage Sohier

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Person 99.5
Person 99.4
Person 96.9
Meal 91
Food 91
Person 89.2
Clothing 86.6
Apparel 86.6
Nature 86.5
Outdoors 86.1
Shelter 82.5
Building 82.5
Rural 82.5
Countryside 82.5
Person 71.6
Person 70.2
Shorts 67.9
Wood 63.9
Tree 63.8
Plant 63.8
Leisure Activities 61.1
Tire 57.2
Vacation 56.8
Screen 56
Electronics 56
Camping 55.1

Clarifai
created on 2023-10-25

people 99.9
group together 99.1
group 98.1
adult 98
man 97.3
woman 96.9
two 96.9
three 94.6
administration 94.5
child 93.4
boy 93.4
wear 92.7
home 90.5
campsite 90.1
family 87.9
portrait 86.4
four 84.8
art 84.5
son 82.5
furniture 82.4

Imagga
created on 2022-01-09

wheeled vehicle 35.7
shopping cart 35
handcart 31.1
outdoors 20.2
man 20.2
grass 16.6
chair 16.4
people 16.2
sky 15.9
male 15.6
outdoor 15.3
tool 15.2
container 15.1
musical instrument 15.1
conveyance 14.4
adult 14.3
summer 13.5
park 13.2
work 13.2
landscape 12.6
person 12.6
day 12.6
vehicle 12.5
tricycle 12.3
sitting 12
old 11.8
field 11.7
accordion 11
relax 10.9
tree 10.8
sun 10.5
outside 10.3
device 10
leisure 10
shovel 9.7
freedom 9.1
environment 9
meadow 9
keyboard instrument 8.9
lifestyle 8.7
business 8.5
bench 8.5
sport 8.4
house 8.4
lady 8.1
suit 8.1
working 8
seat 7.9
garden 7.8
sunny 7.7
winter 7.7
lawn 7.5
relaxation 7.5
wood 7.5
home 7.2
wind instrument 7.1
businessman 7.1
rural 7
travel 7
autumn 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.6
person 90.9
clothing 89.8
black and white 71.9
old 47.1
clothes 17.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Male, 99.7%
Sad 88.3%
Calm 4.4%
Angry 2.7%
Confused 1.7%
Disgusted 1.2%
Fear 0.9%
Surprised 0.4%
Happy 0.3%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Calm 72.9%
Confused 16.7%
Angry 3.9%
Sad 3.5%
Disgusted 1.4%
Surprised 0.7%
Happy 0.5%
Fear 0.5%

AWS Rekognition

Age 31-41
Gender Female, 99.7%
Sad 98.6%
Confused 0.6%
Calm 0.3%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%
Happy 0%

AWS Rekognition

Age 2-8
Gender Female, 66.1%
Sad 99.5%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Happy 0%
Confused 0%
Calm 0%

AWS Rekognition

Age 2-8
Gender Male, 99.2%
Calm 64.7%
Confused 20.6%
Angry 8.9%
Sad 3.6%
Surprised 1.3%
Disgusted 0.5%
Fear 0.3%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

GMC
lite

Google

GM lite
GM
lite