Human Generated Data

Title

Untitled (girl with Christmas toys)

Date

c. 1955, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.193

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 95.9
Human 95.9
Person 92.6
Bedroom 91
Room 91
Indoors 91
Wheel 78.2
Machine 78.2
Living Room 75.4
Face 74.6
Dorm Room 72.9
Furniture 72.8
Person 67.9
Table 65.1
Female 64.3
Sewing 63.7
Girl 60.5
Workshop 58.9
People 58.2
Electronics 57.5
Screen 57.5
Baby 56.1
Monitor 56
Display 56
Bed 55.3

Clarifai
created on 2023-10-25

people 100
wear 98.9
adult 98.6
woman 98.6
two 97.7
monochrome 97.5
group 96.6
group together 95.8
three 94.8
vintage 94.2
one 94.1
child 92.4
man 90
retro 89.3
artisan 89.1
black and white 87.6
nostalgia 87.5
room 87.3
chair 87.3
art 87.2

Imagga
created on 2022-01-08

seller 39.1
handcart 24.2
shopping cart 23.4
work 19.7
wicker 19.3
chair 19.1
person 18.8
table 18.6
wheeled vehicle 18
lifestyle 15.9
people 15.6
man 15.4
product 15.4
container 14.8
smiling 14.5
outdoor 13.8
basket 13.3
sitting 12.9
furniture 12.7
outdoors 12.7
cheerful 12.2
computer 12.2
home 12
business 11.5
seat 11.5
male 11.3
happy 11.3
laptop 11.1
outside 11.1
creation 11.1
adult 11
interior 10.6
iron 10.1
shopping basket 9.9
pretty 9.8
one 9.7
indoors 9.7
room 9.6
boy 9.6
happiness 9.4
garden 9.3
smile 9.3
musical instrument 9.3
shop 9.1
park 9.1
technology 8.9
chairs 8.8
patio 8.7
barrow 8.5
black 8.4
attractive 8.4
relaxation 8.4
leisure 8.3
holding 8.2
brown 8.1
newspaper 8.1
women 7.9
urban 7.9
day 7.8
standing 7.8
child 7.7
modern 7.7
communication 7.6
house 7.5
wood 7.5
coffee 7.4
food 7.3
metal 7.2
job 7.1
working 7.1
businessman 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.5
table 97.4
furniture 95.3
clothing 93.7
person 92.5
woman 90.4
chair 86.4
black and white 81.5
human face 69.6

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 2-10
Gender Female, 85.3%
Calm 94.4%
Surprised 2.9%
Fear 0.9%
Confused 0.7%
Sad 0.5%
Disgusted 0.2%
Happy 0.2%
Angry 0.2%

AWS Rekognition

Age 13-21
Gender Female, 100%
Calm 48.5%
Surprised 25.4%
Sad 14.2%
Confused 4.4%
Disgusted 2.7%
Angry 2.4%
Fear 2.1%
Happy 0.4%

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 95.9%
Wheel 78.2%

Categories

Imagga

paintings art 90.8%
people portraits 8.6%

Captions

Microsoft
created on 2022-01-08

a woman sitting on a table 72.7%
a woman sitting at a table 72.6%
a woman sitting in a chair 72.5%

Text analysis

Google

Ad
Ad