Human Generated Data

Title

Untitled (children playing with blocks)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17911

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Play 99.7
Person 98.5
Human 98.5
Furniture 97.4
Chair 97.4
Person 96.1
Clothing 93
Apparel 93
Baby 92.1
Face 89.7
Indoors 89
Floor 88.9
Tabletop 88.6
Kid 84.7
Child 84.7
Table 80.9
Shorts 80.4
Boy 80.3
Portrait 77.2
Photography 77.2
Photo 77.2
Dress 74.6
Room 74.2
Female 71.9
Living Room 70.4
Sitting 69.5
Flooring 68.6
Meal 67.3
Food 67.3
Girl 66.2
Text 61.8
Play Area 61.6
Playground 61.6
Shoe 59
Footwear 59
Dining Table 56.8
Toy 55.7
Person 55.6

Clarifai
created on 2023-10-29

people 99.9
child 98.6
recreation 96.4
adult 96.4
group together 96.3
music 96.2
two 95.6
group 94.5
one 93.3
man 92.1
woman 92
three 91.9
musician 91.3
boy 89.6
wear 89.5
vehicle 87.3
sitting 85.7
guitar 85.7
furniture 84.3
leader 84.3

Imagga
created on 2022-03-04

person 32.9
man 30.5
people 29
computer 26.5
adult 24.7
male 24.2
office 24.2
work 22.9
working 22.1
indoors 22
business 21.2
checker 20.6
lifestyle 20.2
equipment 20.1
technology 20
sitting 19.7
laptop 19.6
hand 18.2
education 17.3
keyboard 17.1
table 17
home 16.7
senior 15.9
looking 15.2
men 14.6
game equipment 14.2
occupation 13.7
elderly 13.4
job 13.3
retirement 12.5
room 12.4
businessman 12.4
happy 11.9
casual 11.9
old 11.8
notebook 11.6
interior 11.5
desk 11.5
couple 11.3
professional 11.3
student 11.2
mature 11.2
paper 11
day 11
smiling 10.8
typing 10.7
studying 10.5
game 10.5
learning 10.3
women 10.3
communication 10.1
indoor 10
retired 9.7
class 9.6
monitor 9.5
corporate 9.4
musical instrument 9.4
finger 9.2
camera 9.2
active 9.2
businesswoman 9.1
color 8.9
newspaper 8.8
executive 8.7
bright 8.6
businesspeople 8.5
puzzle 8.4
portrait 8.4
modern 8.4
study 8.4
device 8.3
leisure 8.3
tabletop 8.3
percussion instrument 8.2
human 8.2
alone 8.2
playing 8.2
pensioner 8
mixer 7.9
concentration 7.7
busy 7.7
using 7.7
attractive 7.7
talking 7.6
learn 7.6
house 7.5
fun 7.5
one 7.5
floor 7.4
glasses 7.4
clothing 7.3
book 7.3
design 7.3
worker 7.2
family 7.1
crossword puzzle 7.1
chair 7.1
together 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

black and white 89.3
text 87.8
person 82.7
clothing 81.6
human face 71.3
toddler 59.7
child 52.3

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 64.3%
Surprised 84.8%
Calm 13%
Sad 1%
Angry 0.5%
Disgusted 0.3%
Fear 0.2%
Happy 0.1%
Confused 0.1%

AWS Rekognition

Age 39-47
Gender Male, 98.6%
Surprised 88.1%
Calm 9.6%
Fear 1%
Confused 0.6%
Sad 0.3%
Angry 0.2%
Disgusted 0.2%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.5%
Person 96.1%
Person 55.6%

Categories

Imagga

paintings art 99.4%

Captions

Text analysis

Amazon

N
ten
A
S
YT3RAS
MJI7
MJI7 YT3RAS ACSHA
ACSHA

Google

MJI7 YT3RA
MJI7
YT3RA