Human Generated Data

Title

Untitled (three people and two dogs standing outside run down house)

Date

c. 1945

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3379

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2022-01-22

Building 100
Countryside 100
Rural 100
Shelter 100
Outdoors 100
Nature 100
Person 99.5
Human 99.5
Person 99.4
Yard 99.4
Person 98.8
Housing 96.2
Plant 96.2
Grass 96.2
Apparel 95.1
Clothing 95.1
Vegetation 94.6
Face 91.9
Female 86.3
Tree 84.5
House 82.9
Dress 79.7
Furniture 78
Chair 78
Villa 77.6
Land 75
Backyard 74.5
Urban 73.9
Porch 72.7
Dog 72.7
Pet 72.7
Mammal 72.7
Canine 72.7
Animal 72.7
Door 72
Girl 69
Pants 68.1
Shorts 68
Photography 67.5
Portrait 67.5
Photo 67.5
Jar 65.5
Vase 65.5
Pottery 65.5
Potted Plant 65.5
Woman 64.1
Dog 64.1
Standing 64.1
Kid 61.8
Child 61.8
Hut 57.8
Patio 57.5
Pool 56.2
Water 56.2
Swimming Pool 56.2
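
The label list above follows the output shape of Amazon Rekognition's DetectLabels operation (label name plus a 0-100 confidence score). A minimal sketch of how such tags could be regenerated with boto3 is given below; the bucket name, object key, and region are placeholder assumptions, and scores will differ across model versions.

# Sketch: list Rekognition labels for the photograph with their confidence scores.
# Assumes AWS credentials are configured; the S3 location is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 56
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')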

Imagga
created on 2022-01-22

shopping cart 100
handcart 100
wheeled vehicle 99.7
container 61.4
conveyance 34.2
sky 16.6
summer 15.4
chair 14.8
tree 14.6
house 13.4
old 13.2
outdoors 12.7
trees 12.4
water 12
outdoor 11.5
holiday 11.5
furniture 11.3
sun 11.3
empty 11.2
landscape 10.4
wood 10
city 10
building 9.6
seat 9.5
grass 9.5
rocking chair 9.4
dark 9.2
travel 9.1
park 9.1
forest 8.7
light 8.7
evening 8.4
shopping 8.3
home 8
rural 7.9
wooden 7.9
country 7.9
structure 7.9
sea 7.8
fence 7.7
winter 7.7
leisure 7.5
exterior 7.4
night 7.1
architecture 7

Google
created on 2022-01-22

Building 93.5
Plant 91.1
Black 89.7
Door 89.4
Black-and-white 84.6
Chair 83.5
House 79.6
Adaptation 79.2
Tree 78.2
Tints and shades 77.1
Monochrome 77
Monochrome photography 76
Cottage 72.8
Facade 71.7
Room 70
Porch 69.5
Art 68
Vintage clothing 67.4
History 64.5
Font 63.6
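
The Google tags above resemble the label detection feature of the Cloud Vision API. A minimal sketch with the google-cloud-vision Python client follows; the local file name is a placeholder, and scores (returned on a 0-1 scale) are rescaled to 0-100 to match the listing.

# Sketch: request Cloud Vision label annotations for a local copy of the image.
# The file name is a placeholder; label scores come back in the range 0-1.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("annas-untitled.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))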

Microsoft
created on 2022-01-22

dog 97.5
building 93.3
outdoor 92
black and white 91
text 82.5
carnivore 78.7
animal 58

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 99.5%
Surprised 0.1%
Happy 0.1%
Confused 0.1%
Sad 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 45-53
Gender Female, 71.2%
Calm 59.2%
Sad 22.8%
Confused 9.9%
Fear 2.3%
Happy 2%
Disgusted 1.5%
Angry 1.3%
Surprised 1.1%

AWS Rekognition

Age 23-31
Gender Male, 96.9%
Calm 89.5%
Happy 9.3%
Sad 0.6%
Surprised 0.3%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Confused 0%
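
The three face records above match the structure returned by Rekognition's DetectFaces operation when all facial attributes are requested: an estimated age range, a gender estimate with confidence, and an emotion distribution per detected face. A minimal sketch is below; the S3 location is again a placeholder.

# Sketch: per-face age range, gender, and emotion estimates via Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}},
    Attributes=["ALL"],  # needed to include age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')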

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
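
The Google Vision face records above report likelihood buckets (Very unlikely through Very likely) rather than numeric scores; they correspond to the face detection feature of the Cloud Vision API. A minimal sketch follows, with the file name as a placeholder.

# Sketch: Cloud Vision face detection, printing the likelihood buckets per face.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("annas-untitled.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihood values are enum members such as VERY_UNLIKELY, UNLIKELY, POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)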

Feature analysis

Amazon

Person 99.5%
Dog 72.7%

Captions

Microsoft

a person standing in front of a building 71.1%
a person standing in front of a building 71%
a person standing in front of a door 68.6%

Text analysis

Amazon

PERINEN
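
The fragment above ("PERINEN") is the kind of result returned by Rekognition's DetectText operation, which reads any legible lettering in the image. A minimal sketch is below; the S3 location is a placeholder, and the detected string simply reflects whatever text is visible in the photograph.

# Sketch: detect text in the image with Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-untitled.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word duplicates
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')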