Human Generated Data

Title

Untitled (two children with new toys in front of Christmas tree)

Date

c. 1945

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2958

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Person 99.1
Human 99.1
Person 97.2
Chair 96.6
Furniture 96.6
Plant 89.1
Wheel 87.9
Machine 87.9
Indoors 86.3
Clothing 86.3
Apparel 86.3
Room 85
Tree 83.1
Living Room 75.5
Portrait 69.1
Face 69.1
Photography 69.1
Photo 69.1
Fir 60.9
Abies 60.9
Flooring 57.4
Door 57.3
Christmas Tree 57.3
Ornament 57.3
Kid 56.1
Child 56.1
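
The Amazon tags above are object-label confidence scores. A list like this can be reproduced with the AWS Rekognition DetectLabels API; the sketch below is a minimal, assumed workflow (the file name, region, and thresholds are illustrative, not values taken from this record).

import boto3

# Minimal sketch: label tagging with AWS Rekognition DetectLabels.
# "annas_christmas_tree.jpg" is a hypothetical local scan of the photograph.
client = boto3.client("rekognition", region_name="us-east-1")

with open("annas_christmas_tree.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # enough to cover a tag list of this length
    MinConfidence=55,    # the lowest scores in the record sit in the mid-50s
)

for label in response["Labels"]:
    # Each entry pairs a label name with a confidence score, e.g. "Person 99.1"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')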

Clarifai
created on 2023-10-26

people 99.6
winter 99
Christmas 98.5
child 98.4
monochrome 98.1
tree 96.2
christmas tree 96
snow 95.9
group 95.7
boy 95.5
girl 95.2
street 94.6
family 93.7
woman 93.4
group together 93
room 93
man 91.5
son 88.9
two 88.7
many 86.3

Imagga
created on 2022-01-21

blackboard 24.2
musical instrument 16.6
newspaper 15.8
black 14.4
sax 14.1
tree 13.9
chair 12.9
man 12.8
wind instrument 12.5
product 12.4
old 11.8
snow 11.7
light 10.7
window 10.4
cold 10.3
season 10.1
holiday 10
wood 10
creation 9.7
landscape 9.7
forest 9.6
decoration 9.5
person 9.5
day 9.4
device 9.4
building 9.4
male 9.2
dark 9.2
bench 9
room 8.6
winter 8.5
business 8.5
design 8.4
park 8.4
people 8.4
house 8.4
city 8.3
sky 8.3
brass 8.2
sun 8
color 7.8
grunge 7.7
ice 7.6
power 7.6
poster 7.5
silhouette 7.4
stringed instrument 7.3
indoor 7.3
glass 7.3
aged 7.2
art 7.2
home 7.2
night 7.1
interior 7.1
businessman 7.1
accordion 7
modern 7

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

Color Analysis

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 86.3%
Calm 74.9%
Fear 14.4%
Sad 4.3%
Happy 2.3%
Surprised 1.6%
Confused 1.3%
Angry 0.6%
Disgusted 0.6%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Calm 68.8%
Happy 29.7%
Sad 0.6%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%
Surprised 0.1%
Confused 0.1%
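
The two AWS Rekognition blocks above are per-face attribute estimates. A hedged sketch of how such output is obtained with the DetectFaces API and the full attribute set (the file name and region are assumptions):

import boto3

# Minimal sketch: face attributes with AWS Rekognition DetectFaces.
# Attributes=["ALL"] is needed to get age range, gender, and emotion estimates.
client = boto3.client("rekognition", region_name="us-east-1")

with open("annas_christmas_tree.jpg", "rb") as f:   # hypothetical local scan
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 18, "High": 24}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 86.3}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unordered; print them highest-confidence first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')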

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
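
The Google Vision results above report each face attribute as a likelihood bucket rather than a percentage. A minimal sketch using the google-cloud-vision client (the file name is an assumption):

from google.cloud import vision

# Minimal sketch: face likelihoods with the Google Cloud Vision API.
client = vision.ImageAnnotatorClient()

with open("annas_christmas_tree.jpg", "rb") as f:   # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is an enum such as VERY_UNLIKELY, POSSIBLE, or VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)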

Feature analysis

Amazon

Person 99.1%
Chair 96.6%
Wheel 87.9%

Categories

Text analysis

Amazon

as
KODAK-A

Google

KODVK- E
KODVK-
E
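
The text tags are OCR of the film's printed edge markings; "KODVK" is likely a misread of the KODAK edge print that Amazon recorded as "KODAK-A". A hedged sketch of the Amazon side using the Rekognition DetectText API (file name assumed):

import boto3

# Minimal sketch: OCR with AWS Rekognition DetectText.
client = boto3.client("rekognition", region_name="us-east-1")

with open("annas_christmas_tree.jpg", "rb") as f:   # hypothetical local scan
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":   # skip the per-word duplicates
        print(detection["DetectedText"], round(detection["Confidence"], 1))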