Human Generated Data

Title

French Market Group, New Orleans

Date

c. 1940

People

Artist: Joseph Woodson Whitesell, American 1876 - 1958

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1706
Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.7
Human 99.7
Person 99.7
Person 99.7
Person 99.6
Person 99.4
Person 99.1
Person 98.9
Person 93.6
Text 88.5
Person 88.3
Art 67.9
Painting 67.9
Outdoors 64
Musical Instrument 61.8
Clothing 59.2
Apparel 59.2
Envelope 59.1
Mail 59.1
Postcard 55

Imagga
created on 2022-01-14

sax 92.7
wind instrument 44.9
musical instrument 32.8
man 28.2
brass 27.1
male 23.4
cornet 22.5
people 20.6
person 20.4
adult 19.2
silhouette 16.5
music 15.6
men 12.9
couple 12.2
musician 12.2
business 11.5
dark 10.9
black 10.8
businessman 10.6
style 10.4
suit 9.9
concert 9.7
performance 9.6
play 9.5
room 9.4
light 9.4
guitar 9.1
playing 9.1
stage 9.1
fashion 9
human 9
sunset 9
device 9
night 8.9
group 8.9
office 8.8
women 8.7
chair 8.7
musical 8.6
passion 8.5
percussion instrument 8.4
attractive 8.4
hand 8.4
window 8.2
dirty 8.1
symbol 8.1
sexy 8
body 8
interior 8
love 7.9
model 7.8
portrait 7.8
modern 7.7
dance 7.6
leisure 7.5
holding 7.4
protection 7.3
lifestyle 7.2
performer 7.2
clothing 7.2
handsome 7.1
working 7.1
sky 7
together 7

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 99.6
person 97
clothing 93.9
outdoor 92.4
man 88.2
music 70.5
old 67.7

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 98.3%
Sad 1.2%
Surprised 0.1%
Angry 0.1%
Happy 0.1%
Confused 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 99.5%
Sad 55.6%
Calm 41.3%
Angry 1.5%
Surprised 0.5%
Confused 0.3%
Fear 0.3%
Disgusted 0.3%
Happy 0.3%

AWS Rekognition

Age 22-30
Gender Female, 57.2%
Fear 30.3%
Sad 28.7%
Calm 20.2%
Happy 9.8%
Disgusted 4.6%
Surprised 2.7%
Angry 2.7%
Confused 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.7%
Painting 67.9%

Captions

Microsoft

a vintage photo of a group of people standing in front of a crowd 82.4%
a vintage photo of a group of people around each other 81.8%
a vintage photo of a group of people in front of a crowd 81.7%

Text analysis

Amazon

trench
market
trench market Grouf 2. &
Grouf
2.
&
Whitentt
Whitentt 2.0.
2.0.

Google

trench uarket iouf Co. R.
uarket
iouf
R.
trench
Co.