Human Generated Data

Title

Fall River, Mass.

Date

1981

People

Artist: Sage Sohier, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.940

Copyright

© Sage Sohier

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.6
Person 98.9
Person 98.3
Person 98.1
Person 88.2
Neighborhood 78
Building 78
Urban 78
Clothing 74.2
Apparel 74.2
Housing 68.1
Person 67.6
Art 59.7
Drawing 57.9
Pedestrian 56.4
Person 43.3
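
The scores above are confidence percentages from Amazon Rekognition's label detection. A minimal sketch of how such tags can be produced with boto3, assuming a hypothetical filename and region (credentials come from the AWS environment); MinConfidence=40 mirrors the lowest score listed:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # hypothetical region
with open("fall_river_mass_1981.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # mirrors the lowest score shown (Person 43.3)
    )
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```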

Clarifai
created on 2023-10-25

people 99.9
monochrome 99.4
group together 98.6
many 98.4
group 97.7
man 97.4
street 96.9
adult 95.8
woman 93.9
child 87
two 86
no person 84.8
crowd 83.5
city 82.7
wear 82.4
administration 81
war 81
black and white 80.4
vehicle 79.3
police 78
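
A comparable sketch for Clarifai's general image-recognition model over its REST API; the access token, image URL, and model path are placeholders, and the exact route may vary by API version:

```python
import requests

# Placeholder token and image URL; Clarifai scores concepts 0-1.
resp = requests.post(
    "https://api.clarifai.com/v2/users/clarifai/apps/main"
    "/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```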

Imagga
created on 2022-01-09

barbershop 41
structure 32.6
shop 32.4
patio 32.4
architecture 25.4
mercantile establishment 25.2
chair 25
building 23.9
area 23.5
city 19.1
house 18.4
table 17.4
travel 16.9
place of business 16.7
sky 15.9
water 15.3
sea 14.1
tourism 14
furniture 13.6
modern 13.3
ocean 13.3
vacation 13.1
construction 12.8
seat 12.7
scene 12.1
island 11.9
summer 11.6
urban 11.4
home 11.2
outdoor 10.7
residential 10.5
room 10.3
beach 10.3
wood 10
chairs 9.8
landscape 9.7
scenic 9.6
luxury 9.4
holiday 9.3
establishment 9.3
relax 9.3
power 9.2
industrial 9.1
plant 9
coast 9
outdoors 9
river 8.9
wheeled vehicle 8.9
interior 8.8
factory 8.8
light 8.7
roof 8.6
skyline 8.5
clouds 8.4
relaxation 8.4
shore 8.4
exterior 8.3
resort 8.2
sand 7.9
stone 7.8
tree 7.7
industry 7.7
apartment 7.7
old 7.7
cityscape 7.6
bridge 7.6
housing 7.5
restaurant 7.5
traditional 7.5
destination 7.5
window 7.5
town 7.4
boat 7.4
machine 7.4
street 7.4
business 7.3
tourist 7.2
sun 7.2
landmark 7.2
steel 7.2
work 7.1
wall 7.1
mobile home 7.1
glass 7
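
Imagga's weighted tags can be fetched in a similar way; a sketch against its v2 tagging endpoint, assuming placeholder API credentials and image URL:

```python
import requests

# Placeholder key/secret pair (HTTP Basic auth) and image URL.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```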

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 93.7
outdoor 87.5
black and white 81.2
black 79.2
person 75.9
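
A sketch of the equivalent call against Azure's Computer Vision analyze endpoint (v3.2), assuming a placeholder resource endpoint, key, and filename; confidences are returned 0-1 and scaled here to match the listing:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
with open("fall_river_mass_1981.jpg", "rb") as f:  # hypothetical filename
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```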

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 97.9%
Calm 56.2%
Happy 40.8%
Angry 0.8%
Sad 0.8%
Surprised 0.6%
Confused 0.3%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 48-54
Gender Female, 99.6%
Calm 49.3%
Happy 20.1%
Disgusted 7.4%
Angry 5.5%
Surprised 5.3%
Confused 5.3%
Fear 4.8%
Sad 2.3%

AWS Rekognition

Age 4-12
Gender Male, 99.6%
Happy 86.8%
Sad 7%
Calm 3.5%
Fear 0.9%
Angry 0.8%
Confused 0.4%
Disgusted 0.3%
Surprised 0.3%

AWS Rekognition

Age 2-8
Gender Male, 96.3%
Fear 33.6%
Sad 28.3%
Calm 19.3%
Angry 8.8%
Disgusted 5.4%
Confused 2.2%
Surprised 1.2%
Happy 1.2%
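
The four blocks above follow the shape of Amazon Rekognition's DetectFaces output. A minimal boto3 sketch, assuming the same hypothetical filename and region as earlier:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # hypothetical region
with open("fall_river_mass_1981.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort to match the descending listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```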

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
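
Google Vision reports per-face likelihood buckets rather than percentages. A sketch with the google-cloud-vision client (2.x API surface assumed), using the same hypothetical filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("fall_river_mass_1981.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY .. VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```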

Feature analysis

Amazon

Person 99.8%
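
Per-object detections like this Person score come from the Instances field of Rekognition's DetectLabels response; a sketch reusing the earlier hypothetical setup:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # hypothetical region
with open("fall_river_mass_1981.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_labels(Image={"Bytes": f.read()})
for label in response["Labels"]:
    # Only some labels (e.g. Person) carry per-instance bounding boxes.
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # Left/Top/Width/Height as image ratios
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at left={box["Left"]:.2f}, top={box["Top"]:.2f}')
```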

Text analysis

Google

E
E
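
The detected strings (two isolated "E" characters here) match the shape of Google Vision's text detection; a sketch, same hypothetical filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("fall_river_mass_1981.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())
response = client.text_detection(image=image)
# The first annotation is the full detected string; the rest are
# the individual tokens reported above.
for annotation in response.text_annotations:
    print(annotation.description)
```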