Human Generated Data

Title

Untitled (girl putting doll into cradle)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16406

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 93.1
Human 93.1
Shop 64.2
Text 61.2
Clothing 57
Apparel 57
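
These tags pair a label name with a confidence score and resemble the output of AWS Rekognition's label detection. Below is a minimal sketch of such a call, assuming configured AWS credentials and a hypothetical local copy of the photograph named photo.jpg:

# Minimal sketch: image labels with confidence scores from AWS Rekognition.
# Assumes AWS credentials and a default region are configured; "photo.jpg"
# is a hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,        # cap the number of labels returned
    MinConfidence=50.0,  # drop labels below 50% confidence
)

# Each label carries a name and a confidence score, e.g. "Person 93.1".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")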

Clarifai
created on 2023-10-28

people 99.8
monochrome 99.2
man 96.3
adult 96.2
child 95.6
nostalgia 95
woman 93.2
retro 92.1
furniture 90.2
group 89.8
vintage 89.3
street 87.4
room 85.5
sit 85.2
chair 84.6
two 84.3
movie 83.3
vehicle 82.7
old 82
seat 81.3

Imagga
created on 2022-02-11

blackboard 100
dishwasher 32
white goods 24.9
shopping 24.8
cart 24.3
business 23.7
market 22.2
buy 20.6
home appliance 18.7
sale 17.6
drawing 16.3
design 16.3
appliance 15.2
store 15.1
supermarket 14.9
trolley 14.8
money 14.5
technology 14.1
shop 14
finance 13.5
sketch 13.3
plan 13.2
shopping cart 13.2
symbol 12.8
trade 12.4
3d 12.4
retail 12.3
man 12.1
grunge 11.9
idea 11.6
chart 11.5
hand 11.4
empty 11.2
paper 11
global 10.9
house 10.9
diagram 10.5
metal 10.5
push 10.4
data 10
financial 9.8
sign 9.8
basket 9.7
purchase 9.6
graph 9.6
map 9.6
home 9.6
customer 9.5
object 9.5
construction 9.4
architecture 9.4
stock 9.3
male 9.2
science 8.9
success 8.8
chalkboard 8.8
computer 8.8
conceptual 8.8
building 8.7
buying 8.7
education 8.7
commercial 8.4
people 8.4
metallic 8.3
person 8.3
cash 8.2
one 8.2
digital 8.1
businessman 7.9
trading 7.8
render 7.8
nobody 7.8
architect 7.7
wall 7.7
project 7.7
texture 7.6
wheel 7.5
pattern 7.5
commerce 7.5
dollar 7.4
style 7.4
car 7.4
board 7.2
decoration 7.2
office 7.2
world 7.2
art 7.2
work 7.1
modern 7

Microsoft
created on 2022-02-11

text 99.9

Face analysis

Amazon

AWS Rekognition

Age 6-12
Gender Female, 96.3%
Calm 60.3%
Sad 18%
Happy 13.9%
Angry 2.6%
Surprised 2.1%
Confused 1.2%
Disgusted 1.2%
Fear 0.8%
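
Age range, gender, and emotion estimates like those above are the kind of attributes returned by Rekognition's face detection. A minimal sketch follows, again assuming configured AWS credentials and a hypothetical local file photo.jpg:

# Minimal sketch: face attributes (age range, gender, emotions) via
# AWS Rekognition's detect_faces. Assumes AWS credentials are configured;
# "photo.jpg" is a hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as a list of type/confidence pairs.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")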

Feature analysis

Amazon

Person
Person 93.1%

Captions

Microsoft
created on 2022-02-11

an old photo of a person 36.5%
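
A caption with a confidence score like the line above is the kind of result returned by an image-description service. The sketch below assumes Azure Computer Vision's v3.2 Describe Image REST endpoint; the endpoint, key, and filename are hypothetical placeholders:

# Minimal sketch: an image caption plus confidence, assuming the Azure
# Computer Vision "Describe Image" v3.2 REST endpoint. ENDPOINT, KEY, and
# "photo.jpg" are hypothetical placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Captions are returned with a 0-1 confidence score.
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")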

Text analysis

Amazon

56
Joe
KODVK-COVEETA
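
Detected strings such as these (including the misread Kodak film-edge marking) are the kind of output returned by Rekognition's text detection. A minimal sketch, under the same assumptions as the earlier Rekognition sketches:

# Minimal sketch: OCR-style text detection with AWS Rekognition's
# detect_text. Assumes AWS credentials are configured; "photo.jpg" is a
# hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip individual WORD entries
        print(detection["DetectedText"], round(detection["Confidence"], 1))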

Google

くTヨヨA-Aaoy aロ」
T
ヨヨ
A
-
Aaoy
a