Human Generated Data

Title

Untitled (four girls on camp bunk beds)

Date

1952

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14556

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Furniture 99.9
Person 96.9
Human 96.9
Bunk Bed 96.4
Person 80.1
Person 73.8
Bed 73.3
Person 69.6
Bed 68.9
Person 68
Person 61.1
Drawing 55.8
Art 55.8
Bedroom 55.5
Indoors 55.5
Room 55.5
Table 55
Bed 51.8

Clarifai
created on 2023-10-29

furniture 99.3
people 99
chair 95.4
no person 95.2
adult 93.9
monochrome 93.1
man 92.9
room 92.8
seat 92.6
indoors 91.6
group 90.7
group together 90.7
two 90.3
many 89.9
woman 89.2
family 89.2
shelf 87.6
inside 86.3
child 84.5
wear 82.4

Imagga
created on 2022-01-29

chair 31.4
furniture 24.9
interior 23.9
seat 23.2
equipment 22.3
room 21.6
barber chair 21.4
shop 21.2
machine 17.5
modern 16.8
window 16.7
house 15.9
device 15.2
mercantile establishment 14.9
table 14.2
inside 13.8
industry 13.6
home 13.5
factory 12.8
transportation 12.5
wood 11.7
building 11.6
hospital 11.5
loom 11.5
steel 11.5
lamp 11.4
man 11.4
indoors 11.4
architecture 11
place of business 10.4
train 9.8
metal 9.6
textile machine 9.5
light 9.3
glass 9.3
3d 9.3
vehicle 9.3
electric chair 9.2
barbershop 9.2
case 9.1
old 9
station 8.8
apartment 8.6
luxury 8.6
passenger 8.5
business 8.5
travel 8.4
floor 8.4
incubator 8.3
city 8.3
office 8.2
industrial 8.2
new 8.1
decor 7.9
apparatus 7.9
design 7.9
urban 7.9
life 7.8
people 7.8
structure 7.8
bakery 7.7
men 7.7
gas 7.7
wall 7.7
instrument of execution 7.7
health 7.6
truck 7.6
power 7.5
bedroom 7.3
instrument 7.1
wooden 7

Microsoft
created on 2022-01-29

text 96.6
black and white 66.5
furniture 55.7

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Female, 60.9%
Sad 44.7%
Calm 30.9%
Angry 6.4%
Happy 5.7%
Disgusted 4.7%
Confused 3.9%
Surprised 1.9%
Fear 1.7%

AWS Rekognition

Age 21-29
Gender Female, 99.5%
Calm 57.3%
Sad 42.2%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 75.6%
Calm 88.2%
Sad 5.9%
Happy 2.6%
Surprised 1.9%
Confused 0.7%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%

Feature analysis

Amazon

Person
Bed
Person 96.9%
Person 80.1%
Person 73.8%
Person 69.6%
Person 68%
Person 61.1%
Bed 73.3%
Bed 68.9%
Bed 51.8%

Categories

Imagga

interior objects 98.1%
paintings art 1.6%

Text analysis

Amazon

Rinso
5
M_117
M_117 YT3RAS ACHA
YT3RAS
ACHA

Google

MJ17 YT33A2 A
MJ17
YT33A2
A