Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16092.2

Machine Generated Data

Tags (confidence scores are percentages)

Amazon
created on 2022-02-11

Person 99.8
Human 99.8
Handrail 99
Banister 99
Patio 98.5
Person 98.3
Porch 97.5
Person 97.5
Person 96.6
Person 96.5
Person 94.8
Railing 94
Person 92.4
Person 90.2
Pergola 87.5
Person 80.3
Person 78.3
Outdoors 65.2
Garden 62.5
Arbour 62.5
Window 60.4
Door 57.9
Person 45
Person 43.5

Imagga
created on 2022-02-11

car 33.2
city 26.6
architecture 26.5
transportation 24.2
structure 24.2
building 23.5
travel 21.1
urban 21
truck 20.5
road 19.9
wheeled vehicle 18.6
sky 18.5
passenger car 18.3
billboard 18
bridge 17.1
traffic 16.1
highway 15.4
landscape 14.9
street 14.7
transport 14.6
vehicle 14.5
signboard 13.3
industry 12
motor vehicle 11.3
modern 11.2
sea 10.9
office 10.7
water 10.7
automobile 10.5
moving van 10.5
buildings 10.4
van 10.4
business 10.3
clouds 10.1
river 9.8
steel 9.7
downtown 9.6
conveyance 9.3
house 9.2
tourism 9.1
old 9.1
vacation 9
passenger 8.9
expressway 8.6
auto 8.6
motion 8.6
glass 8.6
construction 8.6
bay 8.5
tourist 8.4
town 8.3
ocean 8.3
landmark 8.1
tramway 8.1
tower 8.1
shuttle bus 8
light 8
deck 8
station 8
lamp 7.7
drive 7.6
exterior 7.4
speed 7.3
trailer 7.1

Google
created on 2022-02-11

Water 95
Sky 93.2
Plant 87.8
Building 83.5
Tree 82.5
Shade 82.1
Line 81.8
Travel 81
Rectangle 79.8
Urban design 79.7
Leisure 77.8
Tints and shades 77.1
T-shirt 69.3
Lake 67.7
Fence 65.7
Metal 65.6
Handrail 58.7
Landscape 58.4
Room 57.4
Horizon 56.5

Microsoft
created on 2022-02-11

outdoor 96.2
clothing 92.7
person 91.4
sky 87.9
text 83.2
tree 75.6
vacation 74
water 68.7
lake 62.1
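
The machine-generated sections above all share one shape: a label followed by a confidence percentage. As a minimal sketch (the `parse_tags` helper is hypothetical, not part of this record or any vendor API), lines like these can be parsed and filtered by a confidence threshold:

```python
def parse_tags(lines, min_confidence=90.0):
    """Parse 'label score' lines and keep (label, score) pairs at or above the threshold."""
    results = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # The score is the last whitespace-separated token; the label may contain spaces.
        label, _, score = line.rpartition(" ")
        try:
            results.append((label, float(score)))
        except ValueError:
            continue  # skip lines without a numeric score
    return [(lbl, s) for lbl, s in results if s >= min_confidence]

# A few of the Amazon tags from this record:
amazon_tags = """
Person 99.8
Handrail 99
Banister 99
Patio 98.5
Outdoors 65.2
""".splitlines()

print(parse_tags(amazon_tags))  # keeps only the tags at or above 90
```

Raising `min_confidence` trims low-certainty labels such as "Outdoors 65.2" or Imagga's "car 33.2", which is one way to reconcile the divergent tag sets the four services produced for the same photograph.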

Face analysis

AWS Rekognition

Age 48-54
Gender Female, 99.1%
Calm 88.3%
Sad 6.7%
Surprised 2.2%
Happy 1.4%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Female, 51.3%
Calm 98%
Sad 1%
Disgusted 0.3%
Confused 0.2%
Happy 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Male, 85.4%
Sad 55.3%
Calm 19.9%
Disgusted 10%
Happy 3.8%
Confused 3.8%
Angry 3.6%
Fear 1.9%
Surprised 1.8%

AWS Rekognition

Age 6-14
Gender Male, 82.9%
Confused 72.6%
Calm 15.6%
Surprised 4.2%
Sad 2.8%
Fear 2.1%
Disgusted 1.8%
Angry 0.6%
Happy 0.3%

AWS Rekognition

Age 31-41
Gender Male, 98.3%
Calm 98.7%
Confused 0.5%
Disgusted 0.2%
Sad 0.2%
Happy 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Calm 61.6%
Sad 35.5%
Happy 1.3%
Angry 0.6%
Fear 0.4%
Confused 0.2%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 74.6%
Sad 17.6%
Confused 2.9%
Happy 2.1%
Angry 1%
Disgusted 0.8%
Surprised 0.6%
Fear 0.4%

AWS Rekognition

Age 22-30
Gender Male, 97.7%
Sad 45.7%
Calm 44.7%
Confused 3.6%
Angry 2.3%
Disgusted 1.4%
Happy 0.9%
Fear 0.7%
Surprised 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people standing in front of a window 80%
a group of people standing in front of a building 79.9%
a group of people walking in front of a window 79.8%