Human Generated Data

Title

Untitled (man teaching girl to ride bike; boy looking on)

Date

1956

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14959

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.8
Person 99.8
Bicycle 99.7
Bike 99.7
Vehicle 99.7
Transportation 99.7
Person 99.5
Person 99.4
Machine 99.3
Wheel 99.3
Person 94.7
Cyclist 87.3
Sport 87.3
Sports 87.3
Person 86.4
Wheel 79.9
People 65.5
Floor 64.9
Text 58.8

Imagga
created on 2022-03-05

wheelchair 38
wheeled vehicle 37.7
bicycle 32.9
sport 32.7
tricycle 32.4
bike 30.2
man 26.9
vehicle 26.5
people 26.2
chair 25.5
active 23.6
person 22.9
seat 20.2
exercise 20
ball 19.5
male 19.2
conveyance 18.7
planner 18.7
cycle 18.5
athlete 18.3
adult 17.5
competition 17.4
recreation 17
outdoors 15.7
men 15.4
fitness 15.4
lifestyle 15.2
player 14.9
unicycle 14.7
wheel 14.1
city 14.1
transport 13.7
fun 13.5
transportation 13.4
basketball 13.4
leisure 13.3
health 13.2
urban 13.1
action 12
sports 12
training 12
street 12
speed 11.9
outdoor 11.5
play 11.2
old 11.1
outside 11.1
furniture 10.9
cycling 10.8
equipment 10.8
activity 10.7
cyclist 10.7
game 10.7
day 10.2
playing 10
silhouette 9.9
tennis 9.7
court 9.7
body 9.6
boy 9.6
net 9.5
healthy 9.4
motion 9.4
window 9.4
support 9.3
help 9.3
team 9
riding 8.8
happy 8.8
match 8.7
high 8.7
energy 8.4
fit 8.3
teen 8.3
human 8.2
aged 8.1
road 8.1
couple 7.8
black 7.8
ride 7.8
retired 7.8
summer 7.7
exercise bike 7.7
gym 7.6
device 7.5
senior 7.5
friendship 7.5
sports equipment 7.4
teenager 7.3
smiling 7.2
to 7.1
life 7

Microsoft
created on 2022-03-05

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.5%
Calm 92.4%
Surprised 3.7%
Confused 1.9%
Sad 0.6%
Disgusted 0.4%
Fear 0.4%
Angry 0.4%
Happy 0.2%

AWS Rekognition

Age 19-27
Gender Male, 99.9%
Confused 51.5%
Calm 42.1%
Surprised 2.4%
Sad 2.1%
Disgusted 0.9%
Happy 0.5%
Angry 0.4%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Bicycle 99.7%
Wheel 99.3%

Captions

Microsoft

a group of people standing in front of a building 88.9%
a person standing in front of a building 88.8%
a man and a woman standing in front of a building 76.8%

Text analysis

Amazon

KODVK
kirw
KODVK 20VEELA kirw
له
20VEELA

Google

KODVK
2.v
LEL
DVK 2.v LEL A LITW KODVK
A
DVK
LITW