Human Generated Data

Title

Untitled (three children playing with train set)

Date

c. 1950

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6988

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Furniture 99.6
Person 99.2
Human 99.2
Table 97.1
Person 94.4
Room 93.7
Indoors 93.7
Desk 90.1
Person 88.1
Chair 81.3
Workshop 75.4
Bedroom 66.3
Housing 66.1
Building 66.1
Tabletop 65.9
Worker 61.5
Wheel 61.2
Machine 61.2
Electronics 60.5
Person 60.3
Dining Table 57.8
Screen 56.2
Lab 55.6
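
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3 follows; the region, bucket, and file name are placeholders, and the MaxLabels/MinConfidence settings are assumptions rather than the values used to produce this record.

import boto3

# Rekognition client; the region is an assumption, not taken from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# DetectLabels on an image stored in S3 (bucket and key are placeholders).
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-train-set.jpg"}},
    MaxLabels=25,        # cap the number of labels returned
    MinConfidence=55.0,  # drop labels scoring below roughly 55, as in the list above
)

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')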

Clarifai
created on 2023-10-27

people 99.9
group together 98.7
group 98
monochrome 97.5
street 96
man 95.6
adult 94.9
two 94.2
three 93.8
many 91.7
furniture 91.2
canine 90.8
dog 90.4
boy 90.1
several 89.5
five 89.2
four 88.1
woman 87.4
child 87.1
recreation 84.8

Imagga
created on 2022-01-23

center 30
room 26.7
business 26.1
table 23.8
people 22.8
interior 22.1
man 18.8
working 17.7
desk 17.3
person 15.8
work 15.7
male 14.9
indoors 14
team 13.4
chair 13.4
classroom 13.1
modern 12.6
office 12.6
house 12.5
businessman 12.3
meeting 12.2
adult 11.9
building 11.9
businesswoman 11.8
professional 11.4
happy 11.3
kitchen 11.1
engineer 10.9
worker 10.9
job 10.6
new 10.5
group 10.5
home 10.4
corporate 10.3
teamwork 10.2
furniture 10.1
executive 10.1
smiling 10.1
lifestyle 10.1
glass 10.1
city 10
technology 9.6
men 9.4
restaurant 9.2
inside 9.2
architecture 8.7
businesspeople 8.5
dinner 8.4
portrait 8.4
manager 8.4
floor 8.4
service 8.3
successful 8.2
computer 8.2
equipment 8
together 7.9
design 7.9
boy 7.8
luxury 7.7
board 7.6
decoration 7.4
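
The Imagga tags above pair a keyword with a 0-100 confidence. They can be reproduced with Imagga's tagging REST endpoint; the sketch below uses the v2 /tags endpoint with HTTP basic auth, and the API key, secret, and image URL are placeholders (the endpoint version is an assumption about how this record was produced).

import requests

# Imagga v2 tagging endpoint; key, secret, and image URL are placeholders.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/annas-train-set.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry pairs a tag name with a 0-100 confidence, as in the list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')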

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.6
person 90.5
building 60.3
black and white 57.9
clothing 56.9
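
The Microsoft tags above resemble output from the Azure Computer Vision Analyze Image operation. A hedged sketch against the v3.2 REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders, and the API version is an assumption.

import requests

# Azure Computer Vision v3.2 Analyze endpoint; endpoint and key are placeholders.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"
IMAGE_URL = "https://example.org/annas-train-set.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Tags are returned with a 0-1 confidence; scale to match the 0-100 values above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')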

Color Analysis

Face analysis

AWS Rekognition

Age 1-7
Gender Male, 66.1%
Calm 92.4%
Happy 4.1%
Sad 1.4%
Confused 0.6%
Angry 0.5%
Surprised 0.4%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 1-7
Gender Female, 93.2%
Calm 98.8%
Sad 0.8%
Happy 0.2%
Fear 0.1%
Surprised 0.1%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 13-21
Gender Male, 99.5%
Sad 99.8%
Calm 0.1%
Confused 0%
Angry 0%
Fear 0%
Disgusted 0%
Happy 0%
Surprised 0%
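
The three age/gender/emotion blocks above match the shape of Amazon Rekognition's DetectFaces output when the full attribute set is requested. A minimal boto3 sketch follows; the region and S3 location are placeholders.

import boto3

# Region is an assumption, not taken from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Request all attributes so age range, gender, and emotions are returned.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-train-set.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 1, "High": 7}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 93.2}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back with confidences; sort to mirror the ordering above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')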

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
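
The Google Vision rows above report per-face likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A short sketch with the google-cloud-vision client follows; the local file path is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Load the image from a local file; the path is a placeholder.
with open("annas-train-set.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation reports likelihood buckets, not numeric confidences.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)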

Feature analysis

Amazon

Person 99.2%
Person 94.4%
Person 88.1%
Person 60.3%
Chair 81.3%
Wheel 61.2%

Categories

Imagga

interior objects 96.4%
paintings art 1.8%