Human Generated Data

Title

Untitled (woman in green coat)

Date

1969

People

Artist: Eugene Dwiggins, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Robert M. Sedgwick II Fund, 2.2002.2114

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.8
Apparel 99.8
Person 98.5
Human 98.5
Silhouette 92.9
Overcoat 90.1
Coat 90.1
Art 79.4
Suit 74.7
Poster 60.9
Advertisement 60.9
Floor 57.1
Label 55.9
Text 55.9
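
Label names paired with confidence percentages in this form are what Amazon Rekognition's DetectLabels operation returns. Below is a minimal sketch of such a call with boto3; the image file name and the MinConfidence threshold are illustrative placeholders, not details taken from this record.

```python
# Minimal sketch: image labels via Amazon Rekognition DetectLabels (boto3).
# The file name and MinConfidence value are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_woman_in_green_coat.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest score shown above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```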

Clarifai
created on 2023-10-29

art 99.5
people 99.4
one 98.2
painting 98.1
adult 97.7
man 97.2
woman 97.1
wear 96.8
illustration 96.1
portrait 94.5
girl 93.2
shadow 93
silhouette 92.1
action 92
square 90.5
vector 90
city 89.6
music 89.4
sunset 89.2
street 89.2
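
Concept scores like these typically come from Clarifai's general image recognition model. The sketch below uses the clarifai-grpc Python client against the public general-image-recognition model; the access token, image URL, and the user/app identifiers are assumptions for illustration, not details from this record.

```python
# Minimal sketch: concept predictions from Clarifai's public
# general-image-recognition model via the clarifai-grpc client.
# The access token and image URL are placeholders.
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_CLARIFAI_PAT"),)

request = service_pb2.PostModelOutputsRequest(
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[resources_pb2.Input(
        data=resources_pb2.Data(
            image=resources_pb2.Image(url="https://example.org/object-image.jpg")
        )
    )],
)

response = stub.PostModelOutputs(request, metadata=metadata)
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")
```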

Imagga
created on 2022-02-26

device 32.4
fire extinguisher 32.1
bottle 15.3
glass 15
container 13.5
alcohol 12.9
drink 12.5
leaves 12.2
holiday 11.5
plant 11.2
newspaper 10.9
leaf 10.9
flower 10.8
water 10.7
beverage 9.9
juice 9.4
kimono 9.2
wet 8.9
closeup 8.7
insect 8.7
lime 8.7
party 8.6
product 8.5
old 8.4
freshness 8.3
robe 8.3
building 8.3
present 8.2
close 8
celebration 8
gift 7.7
wall 7.7
cocktail 7.7
lemon 7.5
frame 7.5
fun 7.5
vessel 7.4
bar 7.4
yellow 7.3
valentine 7.3
drop 7.2
color 7.2
person 7.2
transparent 7.2
male 7.1
travel 7
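
Imagga exposes tagging through a REST endpoint; a rough sketch of a request that would yield tag/confidence pairs of this kind follows. The credentials and image URL are placeholders, and the /v2/tags endpoint is assumed from Imagga's public API documentation.

```python
# Minimal sketch: tag suggestions from the Imagga v2 REST API using HTTP
# Basic auth. Credentials and the image URL are placeholders.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_URL = "https://example.org/object-image.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```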

Google
created on 2022-02-26

Rectangle 87.1
Sleeve 86.7
Tints and shades 77.3
Art 75.7
Visual arts 68.1
Room 62.5
Fashion design 61.3
Shadow 58.1
Vintage clothing 55.9
Font 54.4
Knee 53.6
Square 52
Magenta 51.4
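
Scores in this style match Google Cloud Vision label detection. A minimal sketch using the google-cloud-vision client library is shown below; the image URI is a placeholder.

```python
# Minimal sketch: label detection with the Google Cloud Vision client
# library. The image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/object-image.jpg")
)

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```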

Microsoft
created on 2022-02-26

gallery 99.9
room 99.9
scene 99.8
wall 98.7
person 91.4
clothing 87.6
art 77
drawing 69.7
text 65
painting 32.7

Color Analysis

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
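
These ratings correspond to the per-face likelihood fields Google Cloud Vision returns from face detection. A sketch of reading them with the same client library follows; the image URI is again a placeholder.

```python
# Minimal sketch: face detection likelihoods with Google Cloud Vision.
# Each *_likelihood field is an enum value such as VERY_UNLIKELY.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/object-image.jpg")
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = getattr(face, f"{field}_likelihood")
        print(field.capitalize(), vision.Likelihood(likelihood).name)
```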

Feature analysis

Amazon

Person
Person 98.5%

Categories

Imagga

paintings art 94.3%
food drinks 3.4%

Captions

Microsoft
created on 2022-02-26

a painting of a man 84%
a painting of a man in a room 83.9%
a painting of a man 76.7%
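
Caption candidates with confidences like these are characteristic of Azure Computer Vision's describe operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, subscription key, and image URL are placeholders.

```python
# Minimal sketch: image captions from Azure Computer Vision's describe
# operation. Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://your-resource.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),  # placeholder key
)

description = client.describe_image("https://example.org/object-image.jpg")
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```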