Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16095.1

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.9
Human 99.9
Handrail 99.5
Banister 99.5
Person 98.2
Person 98.2
Person 97.3
Railing 95.9
Person 95.5
Person 94.7
Patio 94.7
Porch 93.3
Person 93.1
Person 85.9
Person 83.8
Pergola 76.8
Person 76.1
Person 61
Outdoors 59.8
Grass 56.7
Plant 56.7
Person 55.1
Person 53.9
Person 49.2

Clarifai
created on 2023-10-29

people 99.3
street 98.6
travel 98.1
city 98
girl 97.6
architecture 97.5
beach 96.7
man 96.7
urban 95.3
light 95.2
summer 95
sky 94.4
sea 93.9
bridge 93.9
park 93.7
portrait 93.7
recreation 93.6
tree 93.5
hotel 93.3
ocean 93.2

Imagga
created on 2022-02-11

city 34.1
architecture 30.9
building 27.5
urban 22.7
sky 22.5
structure 20.7
travel 19.7
bridge 16.8
car 14.9
tourism 14.8
house 13.4
buildings 13.2
town 13
street 12.9
landmark 12.6
transportation 12.5
vacation 12.3
park 12.3
modern 11.9
old 11.8
road 11.7
traffic 11.4
tourist 11.3
office 11.3
balcony 11.3
landscape 11.2
night 10.7
highway 10.6
clouds 10.1
light 10.1
cityscape 9.5
transport 9.1
business 9.1
tower 8.9
downtown 8.6
glass 8.6
exterior 8.3
conveyance 8.2
device 8
window 8
holiday 7.9
sea 7.8
color 7.8
equipment 7.7
construction 7.7
automobile 7.7
wheeled vehicle 7.6
attraction 7.6
vehicle 7.6
bay 7.5
monument 7.5
ocean 7.5
famous 7.4
water 7.3
new 7.3
sun 7.2
home 7.2
history 7.2
river 7.1
day 7.1

Google
created on 2022-02-11

Sky 93.9
Water 92.9
Tree 85.3
Shade 83.6
Travel 82.9
Rectangle 82.7
Leisure 82.3
Plant 82.1
Urban design 78.7
Tints and shades 77.4
Metal 66.5
Lake 65.4
T-shirt 65
Room 64.5
Landscape 62.4
Handrail 58.8
Tourism 57.3
Square 57.1
Facade 57
Arch 55.7

Microsoft
created on 2022-02-11

person 90.9
text 87.5
sky 86.2
clothing 83
tree 77.3
water 76.5
vacation 64.1
lake 54.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 69-79
Gender Female, 82.1%
Calm 38.1%
Sad 28.6%
Happy 26.3%
Surprised 3.1%
Fear 1.4%
Angry 1.1%
Disgusted 0.9%
Confused 0.6%

AWS Rekognition

Age 38-46
Gender Female, 58.5%
Calm 87.1%
Sad 4.2%
Fear 3.2%
Surprised 1.9%
Confused 1.1%
Disgusted 0.9%
Happy 0.9%
Angry 0.7%

AWS Rekognition

Age 42-50
Gender Male, 99.9%
Sad 89.1%
Calm 5.6%
Happy 2%
Disgusted 1.4%
Confused 1%
Angry 0.5%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 29-39
Gender Male, 74.8%
Sad 46%
Calm 40.3%
Happy 6.8%
Disgusted 2.4%
Fear 2%
Confused 1.2%
Angry 0.9%
Surprised 0.5%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 89%
Happy 5.9%
Sad 2.5%
Disgusted 0.7%
Confused 0.5%
Angry 0.5%
Surprised 0.4%
Fear 0.4%

AWS Rekognition

Age 19-27
Gender Male, 98.6%
Fear 36.8%
Calm 29.3%
Sad 23.9%
Disgusted 3.5%
Confused 2%
Happy 1.8%
Angry 1.4%
Surprised 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.9%
Person 98.2%
Person 98.2%
Person 97.3%
Person 95.5%
Person 94.7%
Person 93.1%
Person 85.9%
Person 83.8%
Person 76.1%
Person 61%
Person 55.1%
Person 53.9%
Person 49.2%

Text analysis

Amazon

ELA
KODYK
ELA eira
KODYK POLETA EIRN
eira
EIRN
POLETA

Google

IA
IA