Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16093.3

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Human 99.7
Person 99.7
Patio 99.3
Handrail 99.3
Banister 99.3
Porch 99.1
Person 99
Person 98.2
Railing 97.8
Person 97.6
Pergola 96
Person 93.7
Person 93.4
Person 91.5
Person 91.4
Person 84.9
Outdoors 74.4
Person 72
Person 67.6
Person 66.6
Garden 55.8
Arbour 55.8

Imagga
created on 2022-02-11

balcony 51.1
structure 32.1
city 30.8
building 30.1
architecture 28.9
sky 24.5
urban 22.7
travel 18.3
patio 16.5
bridge 16.4
water 14
clouds 13.5
house 13.4
landscape 12.6
ocean 12.4
area 12
modern 11.9
landmark 11.7
sea 11.7
tourism 11.5
buildings 11.3
office 11.2
town 11.1
tourist 11
road 10.8
street 10.1
transportation 9.9
vacation 9.8
billboard 9.8
business 9.7
holiday 9.3
outdoor 9.2
island 9.2
old 9.1
car 9
reflection 8.9
people 8.9
highway 8.7
glass 8.6
construction 8.6
university 8.5
cityscape 8.5
bay 8.5
transport 8.2
tower 8.1
trees 8
monitor 8
scene 7.8
equipment 7.7
summer 7.7
downtown 7.7
traffic 7.6
park 7.6
historical 7.5
electronic equipment 7.5
center 7.5
light 7.4
scenery 7.2
river 7.1
signboard 7.1
steel 7.1

Google
created on 2022-02-11

Water 95.1
Sky 95
Tree 87.6
Shade 86.5
Building 86.5
Architecture 85.9
Cloud 85.1
Plant 81.8
Travel 81.3
Urban design 79.9
Rectangle 78.9
Tints and shades 77.4
Leisure 77.3
Lake 76.6
Metal 69.2
Landscape 67.1
Fence 67
Roof 66.2
Room 64.6
Facade 64.1

Microsoft
created on 2022-02-11

tree 99.2
outdoor 94
text 93.6
person 92.5
clothing 91.1
sky 87.8
water 81.7
lake 76.9
vacation 66.3
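Each service above reports one label per line with a trailing confidence score (a percentage). As an illustration only (not part of the catalog record), a minimal Python sketch of parsing such lines into structured records, using a few entries from the Amazon list above:

```python
def parse_tags(text):
    """Parse 'Label score' lines into (label, confidence) tuples.

    Labels may contain spaces (e.g. 'Urban design'), so the
    confidence is taken as the text after the last space.
    """
    records = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        records.append((label, float(score)))
    return records

# Sample entries copied from the Amazon tag list above
amazon = parse_tags("""\
Human 99.7
Person 99.7
Patio 99.3
Garden 55.8
Arbour 55.8
""")

# Sort by confidence, highest first (stable sort keeps ties in order)
amazon.sort(key=lambda r: r[1], reverse=True)
```

The same parser applies unchanged to the Imagga, Google, and Microsoft lists, since all four share the label-then-score layout.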

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 84.9%
Sad 58.9%
Calm 22.3%
Happy 5.1%
Fear 3.8%
Angry 3.7%
Surprised 3.5%
Confused 1.8%
Disgusted 1%

AWS Rekognition

Age 19-27
Gender Male, 92.3%
Sad 57.8%
Fear 12.3%
Calm 10.1%
Disgusted 5.3%
Angry 4.6%
Confused 4.4%
Surprised 3.3%
Happy 2.2%

AWS Rekognition

Age 16-22
Gender Male, 77.1%
Sad 45.9%
Calm 44.5%
Fear 3%
Disgusted 2.2%
Angry 1.9%
Happy 1%
Confused 0.9%
Surprised 0.6%

AWS Rekognition

Age 27-37
Gender Male, 98.7%
Sad 36.1%
Calm 35.1%
Happy 12.1%
Angry 6%
Confused 5%
Disgusted 3.4%
Surprised 1.3%
Fear 0.9%

AWS Rekognition

Age 23-31
Gender Male, 99.5%
Calm 74.9%
Sad 21.8%
Confused 1%
Disgusted 0.8%
Angry 0.5%
Happy 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Male, 97.2%
Calm 64.9%
Sad 29.6%
Confused 2%
Angry 1%
Disgusted 0.9%
Happy 0.7%
Fear 0.6%
Surprised 0.4%

AWS Rekognition

Age 23-33
Gender Male, 83.4%
Calm 65.3%
Sad 18.2%
Confused 7.5%
Fear 3.8%
Angry 1.7%
Disgusted 1.4%
Happy 1.3%
Surprised 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
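Each AWS Rekognition face record above is a distribution of emotion scores summing to roughly 100%. As an illustrative sketch (not part of the catalog record), picking the dominant emotion from such a record:

```python
def dominant_emotion(scores):
    """Return the (name, percent) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

# First face record from the list above
face = {"Sad": 58.9, "Calm": 22.3, "Happy": 5.1, "Fear": 3.8,
        "Angry": 3.7, "Surprised": 3.5, "Confused": 1.8, "Disgusted": 1.0}

dominant_emotion(face)  # → ("Sad", 58.9)
```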

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing in front of a building 67.8%
a couple of people that are standing in front of a building 56.1%
a man that is standing in front of a building 56%

Text analysis

Amazon

MJE
Port E