Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16092.3

Machine Generated Data

Tags (label / confidence %)

Amazon
created on 2022-02-11

Person 99.8
Human 99.8
Porch 98.6
Person 98.5
Patio 98.4
Person 97.9
Handrail 96.5
Banister 96.5
Person 96.2
Person 95.9
Person 91
Pergola 90.9
Railing 90.6
Person 90.3
Person 90.1
Person 79.2
Person 69.6
Person 62.3
Person 60.4
Outdoors 56.1
Person 49.1
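
The number after each tag is a confidence score on a 0-100 scale. For context, here is a minimal sketch of how comparable labels could be produced with Amazon Rekognition's detect_labels call via boto3; the file name photo.jpg, the MaxLabels cap, and the MinConfidence threshold are illustrative assumptions, not settings documented for this record.

```python
import boto3

# Minimal sketch: label detection with Amazon Rekognition (boto3).
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical
# local copy of the image, not a file referenced by this record.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # illustrative cap
    MinConfidence=40.0,  # illustrative threshold
)

# Print label name and confidence, mirroring the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```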

Clarifai
created on 2023-10-29

people 99.5
street 99.2
beach 98.4
architecture 98.3
girl 98
travel 97.9
city 97.9
summer 96.5
sea 96.3
man 96.2
tree 95.9
bridge 95.5
urban 95.3
wedding 95.2
light 95.2
water 95
ocean 94.8
hotel 94.5
sky 94.4
portrait 94.2

Imagga
created on 2022-02-11

structure 26.5
city 24.1
travel 21.8
building 21.4
urban 19.2
balcony 19.2
sky 18
park 17.9
barrier 17.2
architecture 17.1
vacation 14.7
transportation 14.3
bridge 13.6
car 12.7
obstruction 12.7
water 12.7
modern 12.6
sea 12.5
ocean 12.4
business 12.1
landscape 11.9
road 11.7
old 11.1
street 11
bay 10.4
transport 10
tourism 9.9
equipment 9.8
people 9.5
deck 9.4
glass 9.3
holiday 9.3
town 9.3
tract 9.3
outdoor 9.2
island 9.2
wheeled vehicle 9
window 9
device 8.9
steel 8.8
area 8.7
industry 8.5
trip 8.5
journey 8.5
inclined plane 8.4
summer 8.4
machine 8.3
silhouette 8.3
man 8.1
sun 8
light 8
vehicle 7.8
ship 7.7
outside 7.7
construction 7.7
office 7.6
clouds 7.6
beach 7.6
new 7.3
sunset 7.2
trees 7.1
interior 7.1
step 7

Google
created on 2022-02-11

Water 96.1
Sky 94.4
Plant 91.3
Rectangle 87.6
Tree 86.6
Cloud 86.3
Shade 84
Travel 83.4
Building 83.2
Urban design 80.6
Leisure 79.7
Adaptation 79.4
Tints and shades 77.4
Lake 76.6
Landscape 68.5
Metal 67.4
Room 66.2
T-shirt 66.1
Tourism 60.4
Roof 59

Microsoft
created on 2022-02-11

tree 99
outdoor 96.2
text 94.8
water 91
lake 90.4
sky 85.7
person 83.8
clothing 72.7
vacation 71

Color Analysis

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 88.2%
Happy 76.6%
Sad 14.5%
Calm 6.7%
Angry 0.6%
Disgusted 0.6%
Surprised 0.4%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 41-49
Gender Male, 72.2%
Calm 56.1%
Sad 34%
Surprised 2.5%
Confused 2.2%
Fear 2.2%
Angry 1.4%
Disgusted 1.1%
Happy 0.6%

AWS Rekognition

Age 34-42
Gender Male, 75%
Sad 71.7%
Disgusted 24%
Confused 1.1%
Angry 0.9%
Calm 0.7%
Happy 0.7%
Fear 0.6%
Surprised 0.3%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Calm 83.8%
Sad 12.9%
Confused 1.1%
Angry 0.8%
Happy 0.5%
Disgusted 0.5%
Fear 0.3%
Surprised 0.1%

AWS Rekognition

Age 21-29
Gender Male, 97.7%
Calm 96.7%
Happy 0.8%
Sad 0.7%
Angry 0.6%
Disgusted 0.5%
Fear 0.3%
Surprised 0.2%
Confused 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Calm 93%
Sad 3.4%
Confused 1.2%
Happy 0.9%
Angry 0.5%
Surprised 0.5%
Disgusted 0.4%
Fear 0.2%
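
The age ranges, gender estimates, and emotion percentages above are the per-face attributes Amazon Rekognition returns from its detect_faces call. A minimal sketch follows, assuming boto3 and a hypothetical local copy of the image named photo.jpg.

```python
import boto3

# Minimal sketch: face attribute analysis with Amazon Rekognition (boto3).
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

# Print one block per detected face, mirroring the listings above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```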

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
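
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal sketch, assuming the google-cloud-vision 2.x client and the same hypothetical local file name as above:

```python
from google.cloud import vision

# Minimal sketch: face detection with the Google Cloud Vision client.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enums render as names such as VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```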

Feature analysis

Amazon

Person
Person 99.8%
Person 98.5%
Person 97.9%
Person 96.2%
Person 95.9%
Person 91%
Person 90.3%
Person 90.1%
Person 79.2%
Person 69.6%
Person 62.3%
Person 60.4%
Person 49.1%

Categories

Text analysis

Amazon

:
L :
L
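
The fragments above (":", "L :", "L") are the raw strings the text detector found in the image. A minimal sketch of how such strings could be obtained with Amazon Rekognition's detect_text call, under the same assumptions as the earlier examples:

```python
import boto3

# Minimal sketch: text detection with Amazon Rekognition (boto3).
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Keep line-level detections only and print the detected string with its confidence.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```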