Human Generated Data

Title

Untitled (three children at Christmas tea set)

Date

c. 1945

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.461

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Wheel 99.7
Machine 99.7
Plant 99.6
Tree 99.6
Human 99.2
Person 99.2
Person 97.7
Wheel 97.6
Person 95
Bicycle 89.8
Transportation 89.8
Vehicle 89.8
Bike 89.8
Bicycle 87.3
Ornament 86.7
Wheel 84.9
Christmas Tree 69.1
Person 62.2
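
The Amazon tags above are AWS Rekognition label detections, each paired with a confidence score on a 0-100 scale. A minimal sketch of the kind of call that produces such a list, using boto3; the file name photo.jpg, the MinConfidence cutoff, and configured AWS credentials are illustrative assumptions, not part of this record:

import boto3

# Assumes AWS credentials are already configured in the environment.
rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # illustrative threshold; not documented in this record
    )

# Emit "Name Confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')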

Imagga
created on 2022-01-23

vehicle 100
wheeled vehicle 100
tricycle 100
conveyance 72.4
old 25.8
bicycle 21.7
bike 21.5
house 20
cycle 15.6
architecture 14.8
window 14.6
city 14.1
building 13.5
travel 12.7
vintage 12.4
wheel 12.3
black 12
wall 12
transport 11.9
home 11.2
transportation 10.8
man 10.7
sport 10.7
urban 10.5
decoration 10.1
tree 10.1
aged 9.9
park 9.9
pedal 9.9
cold 9.5
outside 9.4
snow 9.4
winter 9.4
street 9.2
historic 9.2
retro 9
color 8.9
interior 8.8
ride 8.7
antique 8.6
holiday 8.6
season 8.6
grunge 8.5
person 8.4
rural 7.9
autumn 7.9
male 7.8
ancient 7.8
texture 7.6
age 7.6
wood 7.5
tourism 7.4
plants 7.4
room 7.3
tourist 7.2
dirty 7.2
road 7.2
active 7.2
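
The Imagga tags are auto-tagging confidences from Imagga's REST API. A minimal sketch against the v2 tags endpoint with the requests library; the credentials and file name are placeholders, not values from this record:

import requests

# Placeholder credentials; real keys come from an Imagga account.
AUTH = ("<api_key>", "<api_secret>")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=AUTH,
        files={"image": f},
    )
resp.raise_for_status()

# Each entry carries an English tag and a 0-100 confidence, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')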

Google
created on 2022-01-23

Wheel 91.6
Plant 91
Chair 86.2
Tree 83.9
Picture frame 81.2
Tints and shades 77.4
Art 74.1
Vintage clothing 70.1
Monochrome 69.7
Room 68.7
Child 67.9
Musical instrument 67.3
Tire 67.3
Wood 65.9
Table 65.6
Stock photography 65.3
House 64.1
Classic 61.8
Conifer 61.5
Sitting 61.1
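
The Google tags correspond to Cloud Vision label detection; scores in the API are 0-1 floats, shown here scaled to 0-100. A minimal sketch with the google-cloud-vision client, assuming service-account credentials and an illustrative local file name:

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scale the 0-1 score to match the 0-100 values in the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")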

Microsoft
created on 2022-01-23

text 95
outdoor 90
old 85.7
christmas tree 63.4
person 57
vintage 36.9
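
The Microsoft tags come from Azure Computer Vision image tagging. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders for a real Azure resource:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    result = client.tag_image_in_stream(f)

# Confidences are 0-1 in the API; scaled to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")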

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 100%
Calm 74.1%
Fear 13.9%
Sad 6.8%
Surprised 1.8%
Happy 1%
Angry 1%
Confused 0.9%
Disgusted 0.6%

AWS Rekognition

Age 0-3
Gender Female, 96%
Calm 89%
Sad 8%
Confused 1%
Surprised 0.7%
Angry 0.4%
Disgusted 0.3%
Fear 0.3%
Happy 0.3%

AWS Rekognition

Age 23-31
Gender Male, 90.6%
Happy 49.5%
Sad 34%
Calm 9.3%
Fear 2.6%
Surprised 2.3%
Confused 0.9%
Disgusted 0.9%
Angry 0.5%

AWS Rekognition

Age 19-27
Gender Female, 53%
Calm 57.5%
Sad 40%
Confused 0.9%
Happy 0.4%
Fear 0.3%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
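
Each AWS Rekognition face block above (age range, gender, and an emotion distribution) matches the shape returned by the DetectFaces operation with all attributes requested. A minimal boto3 sketch, with the file name again an illustrative assumption:

import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

# Print one block per detected face, in the same layout as above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
    print()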

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
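
The Google Vision face blocks report likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal face-detection sketch with the google-cloud-vision client; the pretty-printing helper is our own addition, not part of the API:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes service-account credentials

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def pretty(likelihood) -> str:
    # Turn enum names like VERY_UNLIKELY into "Very unlikely".
    return likelihood.name.replace("_", " ").capitalize()

# One block per detected face, mirroring the layout above.
for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
    print()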

Feature analysis

Amazon

Wheel 99.7%
Person 99.2%
Bicycle 89.8%

Captions

Microsoft

a vintage photo of a person riding a motorcycle in front of a building 71.9%
a vintage photo of a person riding on the back of a motorcycle 67.4%
a vintage photo of a person riding a motorcycle 67.3%
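
The Microsoft captions, with their confidences, match the output of Azure Computer Vision's image description feature. A minimal sketch requesting several candidate captions; endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    description = client.describe_image_in_stream(f, max_candidates=3)

# Confidences are 0-1 in the API; scaled to the percentages shown above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")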