Human Generated Data

Title

Untitled (girl playing with dolls' clothes)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17409

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 88.5
Human 88.5
Clothing 87.1
Apparel 87.1
Floor 70.3
Face 70.2
Person 68.2
Suit 66.1
Coat 66.1
Overcoat 66.1
Furniture 63.2
Room 61
Indoors 61
Door 58.9

Clarifai
created on 2023-10-29

people 99.7
monochrome 98.3
wear 97.9
music 97.6
adult 95.9
art 95.7
musician 95.5
one 95.1
portrait 92.5
dress 90.6
guitar 89.2
black and white 87.1
singer 87
man 87
drum 86.2
room 86.2
two 86
outfit 83.6
theater 83.5
group 81.1

Imagga
created on 2022-02-26

shopping 50.4
cart 47.7
buy 36.6
shopping cart 33.4
sale 32.3
market 29.3
wheeled vehicle 28.7
supermarket 28.6
store 28.3
shop 26.8
retail 25.6
basket 25.6
trolley 23.6
buying 19.3
handcart 19.2
empty 18.9
tricycle 18.3
business 18.2
wheel 17.9
purchase 17.3
push 17.1
person 17
metal 16.9
container 16.8
man 16.8
3d 16.2
customer 15.2
vehicle 13.9
checkout 13.8
trade 13.4
people 13.4
conveyance 13.1
commerce 13.1
automaton 12.7
male 12
money 11.9
trading 11.7
commercial 11.3
object 11
finance 11
consumer 10.7
mall 10.7
chair 10.7
musical instrument 10.5
chrome 10.4
men 10.3
metallic 10.1
adult 10
pushcart 9.9
render 9.5
holiday 9.3
one 9
consumption 8.9
consumerism 8.8
e commerce 8.8
merchandise 8.8
symbol 8.7
grocery 8.7
carrying 8.7
black 8.4
old 8.4
device 8.3
transport 8.2
lifestyle 7.9
groceries 7.8
gift 7.7
human 7.5
cradle 7.4
drawing 7.3
present 7.3
furniture 7.2
celebration 7.2
smile 7.1
silver 7.1
steel 7.1
conceptual 7

Microsoft
created on 2022-02-26

text 94.5
person 92.7
black and white 82.6
clothing 80
drawing 72
sketch 65.4
statue 59

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 88.5%
Person 68.2%

Categories

Imagga

paintings art 92.9%
interior objects 6.7%

Captions

Microsoft
created on 2022-02-26

a person holding a gun 41.6%
a person playing a guitar 30.6%

Text analysis

Amazon

16
COVELA
KODVN

Google

16
16