Human Generated Data

Title

Untitled (family standing in front of house)

Date

1920

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2143

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Person 99.6
Person 99.6
Person 99.5
Person 99.3
Person 99.2
Person 98.7
Person 98.3
Nature 96.8
Person 95.3
Outdoors 93.8
Building 81.3
Housing 81.3
Person 78.8
Snow 75.2
Tree 73.7
Plant 73.7
House 66.1
Home Decor 65.2
Winter 61.3
Storm 56.4
Weather 55.6
Mansion 55.3

Clarifai
created on 2023-10-25

home 99.4
house 99.2
people 98.8
building 96.7
porch 96.4
family 96.1
farmhouse 95.7
monochrome 95.6
man 95.1
tree 94.6
adult 92.4
infrared 92.4
architecture 92.2
group 90.8
street 88.7
fence 88.5
property 86
winter 85.1
wood 84.2
black and white 83.5

Imagga
created on 2021-12-14

picket fence 100
fence 88.5
barrier 63.7
building 52.5
structure 45.4
architecture 44.8
obstruction 42.8
balcony 34.7
house 32
home 21.6
old 20.2
construction 18.8
sky 17.9
landmark 17.2
city 16.6
roof 16.4
travel 16.2
history 15.2
estate 15.2
night 15.1
trees 14.2
historic 13.8
real 13.3
tourism 13.2
winter 12.8
religion 12.6
door 12.4
church 12
school 11.9
temple 11.6
residential 11.5
light 11.4
urban 11.4
brick 11.3
facade 11.3
exterior 11.1
snow 11
stone 11
columns 10.8
windows 10.6
famous 10.2
outdoor 9.9
government 9.8
property 9.7
country 9.7
ancient 9.5
town 9.3
tree 9.2
street 9.2
wood 9.2
residence 9.2
column 9
museum 9
style 8.9
rural 8.8
gate 8.7
entrance 8.7
dome 8.7
cold 8.6
united 8.6
statue 8.6
buildings 8.5
landscape 8.2
grass 7.9
railing 7.9
capital 7.6
window 7.6
traditional 7.5
monument 7.5

Google
created on 2021-12-14

Building 95
Window 93.1
Tree 90.2
Black 89.6
Rectangle 85.5
Plant 85
House 84.8
Fixture 81.8
Line 81.8
Tints and shades 77.4
Facade 75
Landscape 73.3
Monochrome 72.3
Art 71.6
Monochrome photography 71.1
Cottage 68
Room 66.5
Sky 66.4
Darkness 65.7
Siding 65.5

Microsoft
created on 2021-12-14

tree 99.1
outdoor 97.4
house 97
sky 87.3
black and white 75.3
building 75
text 64.4
old 55.2

Face analysis

Amazon

AWS Rekognition

Age 43-61
Gender Male, 98.1%
Calm 65.8%
Happy 32.6%
Sad 0.8%
Angry 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 32-48
Gender Male, 86%
Calm 95.2%
Sad 2.8%
Happy 0.9%
Confused 0.6%
Surprised 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 22-34
Gender Female, 57.4%
Calm 63.1%
Happy 20.2%
Sad 4.9%
Confused 4.8%
Angry 3.7%
Surprised 1.6%
Disgusted 1.1%
Fear 0.6%

AWS Rekognition

Age 26-40
Gender Male, 91.5%
Calm 74.9%
Happy 16.6%
Angry 3.3%
Surprised 1.7%
Sad 1.2%
Confused 1.2%
Disgusted 0.9%
Fear 0.2%

AWS Rekognition

Age 19-31
Gender Female, 60.4%
Calm 72.3%
Happy 21%
Confused 2%
Angry 1.5%
Sad 1.3%
Surprised 1.1%
Disgusted 0.6%
Fear 0.3%

AWS Rekognition

Age 28-44
Gender Male, 98.8%
Calm 92.1%
Happy 4.2%
Sad 1.6%
Angry 0.9%
Confused 0.7%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 35-51
Gender Female, 74.1%
Calm 49.1%
Sad 23.7%
Angry 13.3%
Happy 5.5%
Confused 4.3%
Fear 2%
Surprised 1.1%
Disgusted 1%

AWS Rekognition

Age 56-74
Gender Female, 73.6%
Calm 69.7%
Happy 23.8%
Sad 5.1%
Confused 0.7%
Surprised 0.4%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

Feature analysis

Amazon

Person 99.7%
