Human Generated Data

Title

Untitled (woman shopping at outdoor flower market)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14889
Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.3
Human 99.3
Clothing 98.7
Apparel 98.7
Person 98.6
Person 95.9
Robe 85.4
Fashion 85.4
Gown 78.9
Face 74.1
Portrait 65.5
Photography 65.5
Photo 65.5
People 62.9
Home Decor 62.3
Evening Dress 62
Bullfighter 59.3
Coat 58.9
Priest 58.1
Shoe 57.2
Footwear 57.2
Shoe 55.2
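
Each machine-generated tag above pairs a label with a confidence score (0-100). As an illustrative sketch only (not part of the museum record), tags like these can be filtered to the higher-confidence labels; the 75.0 threshold below is an arbitrary example value, and the tag list is copied from the Amazon Rekognition results above.

```python
# Sketch: filter machine-generated (label, confidence) tags by score.
# Tag data copied from the Amazon Rekognition list above; the
# threshold value is an assumption for illustration.

tags = [
    ("Person", 99.3), ("Clothing", 98.7), ("Robe", 85.4),
    ("Gown", 78.9), ("Face", 74.1), ("Portrait", 65.5),
    ("Bullfighter", 59.3), ("Shoe", 57.2),
]

def confident_tags(tags, threshold=75.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(tags))  # -> ['Person', 'Clothing', 'Robe', 'Gown']
```

Raising the threshold narrows the list to the most reliable labels, which is one way records like this are triaged before human review.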

Clarifai
created on 2023-10-28

people 100
adult 99.3
group 99.1
one 98.5
two 97.9
group together 97.6
home 97
canine 96.9
monochrome 96.7
mammal 95.6
three 94.4
woman 94.4
dog 94.2
administration 93.4
many 92.1
portrait 91.5
several 91.4
child 91.3
street 90
wear 89.6

Imagga
created on 2022-01-29

television 43.1
newspaper 28.4
telecommunication system 28.4
product 21.8
man 21.5
people 18.4
old 18.1
office 17
male 17
laptop 16.9
creation 16.9
business 16.4
computer 15.7
working 15
businessman 15
work 14.9
person 14.3
monitor 13.6
daily 13.2
portrait 12.9
black 12.6
vintage 12.4
home 12
adult 11.7
looking 11.2
happy 10
house 10
technology 9.6
back 9.2
travel 9.1
indoor 9.1
one 9
family 8.9
screen 8.8
world 8.7
ancient 8.6
men 8.6
sitting 8.6
face 8.5
room 8.5
senior 8.4
center 8.2
gray 8.1
interior 8
smiling 8
love 7.9
serious 7.6
electronic equipment 7.4
retro 7.4
light 7.3
aged 7.2
smile 7.1
worker 7.1
job 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.1
clothing 90.4
black and white 87.9
person 86.7
wedding dress 63
store 44.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 45-53
Gender Male, 98.8%
Calm 97.6%
Confused 1.2%
Angry 0.5%
Disgusted 0.3%
Surprised 0.2%
Sad 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 27-37
Gender Male, 94.6%
Calm 87.6%
Sad 11.9%
Confused 0.3%
Happy 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person
Shoe
Person 99.3%
Person 98.6%
Person 95.9%
Shoe 57.2%
Shoe 55.2%

Categories

Text analysis

Amazon

STAG
BEEP
RENCO.
S
S RENCO. Or
Or
-
TaHrcov

Google

STAG BEER SRENCO, 7A
STAG
BEER
SRENCO,
7A