Human Generated Data

Title

Untitled (woman modeling red, white, and blue outfit in indoor rainforest)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15527

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Apparel 100
Clothing 100
Human 97.7
Person 97.7
Fashion 94.6
Cloak 92.2
Cape 85.8
Poncho 60.4

Imagga
created on 2022-03-05

batting cage 47
baseball equipment 37.7
sports equipment 32.6
adult 27.2
person 24.9
sport 24.7
people 23.4
attractive 23.1
happy 21.9
model 21
portrait 20
fashion 19.6
man 18.8
male 18.5
lifestyle 18.1
lady 17
hair 16.6
smiling 16.6
standing 16.5
cute 16.5
casual 16.1
posing 15.1
smile 15
park 14.8
brunette 14.8
pretty 14.7
sexy 14.4
clothing 13.7
youth 13.6
human 13.5
boy 13
expression 12.8
face 12.8
women 12.6
jeans 12.4
shirt 12.3
guy 11.5
device 11.4
body 11.2
style 11.1
teen 11
business 10.9
elevator 10.5
outside 10.3
professional 10.2
happiness 10.2
black 10.2
exercise 10
outdoor 9.9
modern 9.8
cheerful 9.7
outdoors 9.7
couple 9.6
corporate 9.4
call 9.3
long 9.2
leisure 9.1
swing 8.9
garment 8.8
looking 8.8
child 8.8
look 8.8
spring 8.6
suit 8.5
tree 8.5
lifting device 8.4
lips 8.3
fun 8.2
teenager 8.2
fitness 8.1
active 8.1
building 7.9
summer 7.7
dream 7.6
holding 7.4
gorgeous 7.2
pose 7.2
stylish 7.2
dress 7.2
blond 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

tree 98.5
clothing 96.7
text 89.4
person 88.6
outdoor 87.5
footwear 86.9
woman 84.9
dress 63.9
fashion 62.3
human face 50.9

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 100%
Happy 97.5%
Confused 0.5%
Surprised 0.5%
Angry 0.4%
Fear 0.4%
Disgusted 0.4%
Sad 0.2%
Calm 0.1%

Microsoft Cognitive Services

Age 38
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.7%

Captions

Microsoft

a man standing on a bridge 41.8%
a man standing in front of a building 41.7%
a man standing next to a bridge 37.9%

Text analysis

Amazon

KODYK
MAGOM
LIFE

Google

JI1
MAGOX
I133
XAGOX
MAGOX JI1 I133 A2 XAGOX
A2