Human Generated Data

Title

Untitled (man and child looking at doll in radio-controlled toy car)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14846

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.5
Human 99.5
Person 99.2
Person 97.4
Person 96.7
Clothing 94.2
Apparel 94.2
Shoe 86.3
Footwear 86.3
Face 79.6
Text 79.3
Person 75.9
Kid 69.4
Child 69.4
Female 67.5
Girl 66.8
People 66.5
Furniture 62
Home Decor 61.1
Photography 60.8
Photo 60.8
Baby 58.4
Hair 55.6
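
The Amazon tags above are the kind of name/confidence pairs that AWS Rekognition's DetectLabels operation returns. A minimal sketch of how such tags could be produced with boto3, assuming the photograph is available locally as a JPEG (the file name and the MinConfidence threshold are illustrative assumptions, not part of the record):

import boto3

# Illustrative sketch, not the museum's pipeline: detect labels in a local
# scan of the photograph. "gould_toy_car.jpg" is a hypothetical file name.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gould_toy_car.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # lowest confidence listed above is Hair at 55.6
)

for label in response["Labels"]:
    # Prints lines in the same "Name confidence" format used above, e.g. "Person 99.5"
    print(f"{label['Name']} {label['Confidence']:.1f}")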

Clarifai
created on 2023-10-28

people 99.8
adult 97.5
woman 97
man 97
monochrome 96.8
group together 96.8
group 93.9
wear 93.7
child 92
furniture 91.5
chair 90.7
military 90.3
seat 90.2
vehicle 90
administration 90
sit 89.8
nostalgia 89.4
indoors 89
sitting 88.7
two 88.6

Imagga
created on 2022-01-29

monitor 29.2
work 25.9
newspaper 23.4
equipment 23.3
working 23
interior 22.1
computer 21.6
office 21.4
barbershop 21.2
shop 20.8
business 20.6
indoors 20.2
house 19.2
home 19.1
center 18.6
technology 18.6
window 18.2
room 17.9
product 17.3
man 16.1
people 15.1
desk 14.6
creation 14.5
network 14.1
architecture 14.1
modern 14
chair 13.8
furniture 13.7
mercantile establishment 13.5
table 13.5
male 13.5
device 13.1
occupation 12.8
building 12.7
laptop 12.3
data 11.9
adult 11.6
job 11.5
keyboard 11.4
industry 11.1
inside 11
electronic equipment 11
locker 10.6
server 10.4
person 10
worker 9.8
hospital 9.4
construction 9.4
light 9.4
place of business 9
design 9
engineer 8.8
steel 8.8
communications 8.6
glass 8.6
fastener 8.5
city 8.3
new 8.1
information 8
display 7.9
urban 7.9
corporate 7.7
sitting 7.7
men 7.7
factory 7.7
wall 7.7
comfortable 7.6
system 7.6
communication 7.6
machine 7.5
floor 7.4
safety 7.4
industrial 7.3
metal 7.2
call 7.2
kitchen 7.2

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 97
person 90.8
cartoon 82.4
black and white 70.4
screenshot 69.4
clothing 64.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 89.7%
Fear 85.5%
Sad 6%
Calm 5.1%
Happy 1.3%
Angry 0.7%
Surprised 0.7%
Confused 0.5%
Disgusted 0.3%

AWS Rekognition

Age 23-31
Gender Female, 68.5%
Happy 64%
Calm 21%
Angry 5.7%
Surprised 4.5%
Disgusted 1.9%
Sad 1.4%
Fear 1.1%
Confused 0.5%
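
The two AWS Rekognition entries above (age range, gender, and emotion percentages, one per detected face) match the shape of Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch under the same assumptions as before (hypothetical file name):

import boto3

# Illustrative sketch: request full facial attributes, which include the
# AgeRange, Gender, and Emotions fields summarized above.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gould_toy_car.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as uppercase types (e.g. HAPPY) with confidences
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")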

Feature analysis

Amazon

Person 99.5%
Person 99.2%
Person 97.4%
Person 96.7%
Person 75.9%
Shoe 86.3%

Categories

Captions

Microsoft
created on 2022-01-29

a group of people in a room 80.8%

Text analysis

Amazon

NAGON
MJI7 YT37A2 NAGON
YT37A2
MJI7
IDAD
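
The Amazon strings above are raw OCR detections (they read like mirrored film edge markings), of the kind AWS Rekognition's DetectText returns. A minimal sketch, again with a hypothetical file name:

import boto3

# Illustrative sketch: DetectText returns raw text detections, which for a
# negative or contact sheet can include mirrored edge markings like those above.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gould_toy_car.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip WORD entries, which repeat the same text
        print(detection["DetectedText"])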

Google

KODVK 2.LELA
KODVK
2.LELA