Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16094.1

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.7
Human 99.7
Handrail 99.6
Banister 99.6
Person 98.9
Person 98.5
Person 98.4
Patio 95.9
Person 95.5
Porch 94.5
Person 92.8
Person 92.8
Railing 90.4
Person 87.1
Person 86.2
Person 76.4
Pergola 75.5
Person 71.8
Window 69.3
Person 64.8
Person 58.5
Arbour 57.5
Outdoors 57.5
Garden 57.5
Person 56.6
Person 55

Clarifai
created on 2023-10-29

street 99.5
city 99.1
girl 98.9
beach 98.9
sea 98.6
people 98.6
travel 98.4
architecture 98.2
hotel 97.4
ocean 97.3
urban 97.2
bridge 96.6
water 96.3
modern 95.8
light 95.6
window 95.6
summer 95.5
sight 95.5
sky 95.4
beautiful 95

Imagga
created on 2022-02-11

city 33.2
architecture 33
building 32.1
structure 27.7
park 24.6
sky 24.5
urban 23.6
car 23.5
bridge 21.1
tract 18.9
street 18.4
travel 16.9
modern 15.4
road 15.4
traffic 15.2
landscape 14.9
transportation 14.3
house 13.4
office 12.8
passenger car 12.5
exterior 12
transport 11.9
old 11.8
landmark 11.7
wheeled vehicle 11.7
tourism 11.5
conveyance 11.4
buildings 11.3
water 11.3
vehicle 11.3
billboard 11
sea 10.9
balcony 10.8
tower 10.7
highway 10.6
bay 10.4
business 10.3
construction 10.3
industry 10.2
town 10.2
clouds 10.1
tree 10
vacation 9.8
shuttle bus 9.6
automobile 9.6
signboard 9.4
window 9.3
ocean 9.1
new 8.9
downtown 8.6
attraction 8.6
cityscape 8.5
center 8.4
outdoor 8.4
tourist 8.4
historic 8.2
area 8.2
outdoors 8.2
reflection 8.1
sun 8
light 8
trees 8
steel 8
glass 7.8
summer 7.7
shuttle 7.7
barrier 7.6
scenery 7.2
horizon 7.2
river 7.1
day 7.1
public transport 7

Google
created on 2022-02-11

Water 93.9
Sky 92.9
Cloud 87.1
Shade 86.3
Rectangle 84.4
Travel 81.8
Urban design 80.8
Tree 80.3
Leisure 79.3
Plant 78.8
Tints and shades 77.4
Lake 67
Metal 65
Landscape 64.5
T-shirt 62.8
Room 62.6
Handrail 59
Arch 57.3
Horizon 56.2
Tourism 53.9

Microsoft
created on 2022-02-11

sky 88.4
person 86.6
outdoor 85.1
water 80.5
text 79.8
clothing 79.6
lake 78.3
tree 74.3
vacation 61.1
bridge 52.2
trip 51.1

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 68%
Calm 81%
Happy 12.7%
Sad 3.5%
Confused 1.1%
Angry 0.7%
Disgusted 0.5%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Male, 96.8%
Disgusted 91.6%
Sad 3%
Angry 1.8%
Fear 1.5%
Calm 1%
Surprised 0.4%
Confused 0.3%
Happy 0.2%

AWS Rekognition

Age 27-37
Gender Male, 98.4%
Sad 73.8%
Happy 8.5%
Disgusted 8.3%
Calm 4.9%
Confused 3.4%
Angry 0.6%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 20-28
Gender Male, 91.7%
Calm 64.2%
Fear 17.1%
Sad 5.5%
Confused 4.7%
Disgusted 2.6%
Happy 2.5%
Surprised 1.8%
Angry 1.5%

AWS Rekognition

Age 30-40
Gender Male, 92%
Calm 80.2%
Sad 6.1%
Happy 5.7%
Disgusted 3.7%
Confused 1.3%
Surprised 1.2%
Angry 1.1%
Fear 0.6%

AWS Rekognition

Age 23-33
Gender Male, 88.9%
Sad 97.4%
Fear 1.6%
Disgusted 0.3%
Calm 0.2%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 98.9%
Person 98.5%
Person 98.4%
Person 95.5%
Person 92.8%
Person 92.8%
Person 87.1%
Person 86.2%
Person 76.4%
Person 71.8%
Person 64.8%
Person 58.5%
Person 56.6%
Person 55%

Text analysis

Amazon

3
tir
В
MADO
207 E L E L A tir
KODYK
207 E L E L A
EITN

Google

MAGO MJI
MAGO
MJI