Human Generated Data

Title

Untitled (girl with hula hoop)

Date

1962

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16526

Machine Generated Data

Tags

Amazon
created on 2022-02-12

Person 99.7
Human 99.7
Person 99.5
Person 98.5
Person 97.3
Jaw 77.2
Leisure Activities 67
Dance Pose 56.7
Clothing 56.2
Apparel 56.2
Sphere 56
Chair 55.4
Furniture 55.4
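Each machine-generated tag above pairs a label with a confidence score (0–100). As a minimal sketch of how such output might be consumed, the following filters a list by a confidence threshold; the (label, score) pairs are copied from the Amazon list above, and the threshold of 90 is an arbitrary illustration, not part of the record:

```python
# Sketch: filter machine-generated labels by confidence.
# Data copied from the Amazon tag list; threshold is illustrative only.
amazon_labels = [
    ("Person", 99.7), ("Human", 99.7), ("Person", 99.5), ("Person", 98.5),
    ("Person", 97.3), ("Jaw", 77.2), ("Leisure Activities", 67.0),
    ("Dance Pose", 56.7), ("Clothing", 56.2), ("Apparel", 56.2),
    ("Sphere", 56.0), ("Chair", 55.4), ("Furniture", 55.4),
]

def confident_labels(labels, threshold=90.0):
    """Return distinct label names at or above the threshold, in order."""
    seen = []
    for name, score in labels:
        if score >= threshold and name not in seen:
            seen.append(name)
    return seen

print(confident_labels(amazon_labels))  # ['Person', 'Human']
```

At a 90-point cutoff only the repeated "Person" detections and "Human" survive, which matches the feature-analysis summary further below.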

Clarifai
created on 2023-10-29

science 99.1
future 98.2
research 98
people 97.9
technology 97.6
spherical 97.6
internet 97.5
virtual 97.5
man 97.1
futuristic 96.8
interaction 96.6
woman 96.6
interface 96.5
touch 96.1
discovery 95.3
connection 95.2
option 94.7
ball-shaped 94.6
computer 94.5
healthcare 94.5

Imagga
created on 2022-02-12

people 24
person 22.1
tennis 20.5
racket 19.4
adult 19
lifestyle 17.3
professional 17.3
male 16.3
man 16.1
pretty 16.1
ball 15.6
sport 15.3
portrait 14.9
player 14.9
smiling 14.5
happy 14.4
equipment 14.3
smile 14.2
women 14.2
exercise 13.6
active 13.5
health 13.2
office 13
sexy 12.8
fitness 12.6
attractive 12.6
activity 12.5
face 12.1
healthy 12
competition 11.9
hair 11.9
technology 11.9
holding 11.5
business 11.5
device 11.5
court 10.7
game 10.7
medical 10.6
human 10.5
computer 10.4
looking 10.4
play 10.3
casual 10.2
modern 9.8
science 9.8
net 9.7
fit 9.2
fun 9
medicine 8.8
body 8.8
specialist 8.6
men 8.6
occupation 8.2
care 8.2
working 7.9
microphone 7.9
match 7.7
hospital 7.6
athlete 7.6
communication 7.6
stethoscope 7.5
computerized axial tomography scanner 7.5
one 7.5
lovely 7.1

Google
created on 2022-02-12

Microsoft
created on 2022-02-12

text 89.3
person 85.3
window 83.5
clothing 83.3
cartoon 72
black and white 64.4
posing 59.6
mirror 57.8

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 87.3%
Happy 42.7%
Sad 26.3%
Confused 18%
Surprised 5.6%
Disgusted 2.2%
Fear 1.9%
Calm 1.7%
Angry 1.6%

AWS Rekognition

Age 50-58
Gender Female, 61.3%
Happy 87.6%
Calm 5.2%
Sad 3.7%
Confused 1.5%
Disgusted 0.8%
Surprised 0.5%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 26-36
Gender Male, 91.3%
Happy 42.4%
Surprised 20.7%
Sad 16.7%
Calm 10.2%
Confused 4.4%
Disgusted 2.4%
Fear 1.6%
Angry 1.5%
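Each AWS Rekognition face record above distributes confidence across eight emotions. A minimal sketch, assuming the values listed for the three faces, of reducing each record to its highest-scoring emotion:

```python
# Sketch: summarize each Rekognition face record by its top emotion.
# Dictionaries mirror the three face-analysis blocks listed above.
faces = [
    {"Happy": 42.7, "Sad": 26.3, "Confused": 18.0, "Surprised": 5.6,
     "Disgusted": 2.2, "Fear": 1.9, "Calm": 1.7, "Angry": 1.6},
    {"Happy": 87.6, "Calm": 5.2, "Sad": 3.7, "Confused": 1.5,
     "Disgusted": 0.8, "Surprised": 0.5, "Angry": 0.4, "Fear": 0.3},
    {"Happy": 42.4, "Surprised": 20.7, "Sad": 16.7, "Calm": 10.2,
     "Confused": 4.4, "Disgusted": 2.4, "Fear": 1.6, "Angry": 1.5},
]

def dominant_emotion(emotions):
    """Return the (name, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

for i, face in enumerate(faces, 1):
    name, score = dominant_emotion(face)
    print(f"Face {i}: {name} ({score}%)")  # "Happy" leads for all three
```

Note that the leading score varies widely (42.4% to 87.6%), so "Happy" is a confident call only for the second face.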

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 99.5%
Person 98.5%
Person 97.3%

Categories

Imagga

interior objects 97.3%
paintings art 2.4%

Captions

Microsoft
created on 2022-02-12

a man and a woman standing in front of a window 32.6%

Text analysis

Amazon

5
O
pulmo
pulmo nu
nu
O omuc
no
1280
1280 Ecolomic
Ecolomic
popaluo
popaluo quifot
für
quifot
omuc
pocpy
teamio
Leachnum
labonon
Leachnum cuporen
ichicreato
cuporen
DE

Google

MJI7-- Y T37A°2 - - XAGOX
MJI7--
Y
T37A°2
-
XAGOX