Human Generated Data

Title

Untitled (man and woman helping little child to walk)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17850

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 100
Apparel 100
Human 99.6
Person 99.6
Person 99.2
Coat 99
Canine 88.3
Dog 88.3
Animal 88.3
Pet 88.3
Mammal 88.3
Raincoat 87.9
Tarmac 83.7
Asphalt 83.7
Shoe 78.6
Footwear 78.6
Overcoat 69.6
Shoe 68.7
Walking 64.2
People 61.1
Road 57.6

Imagga
created on 2022-02-26

crutch 68.4
staff 52.9
stick 49.2
sport 38.3
man 32.9
male 25.6
outdoors 19.4
grass 19
active 18.9
person 18.7
outdoor 18.3
athlete 18.2
people 17.8
pedestrian 16.8
competition 16.5
ball 16
lifestyle 15.9
leisure 15.8
sports equipment 14.9
exercise 14.5
adult 14.3
road 13.5
golf 13.4
walking 13.3
summer 12.9
recreation 12.5
activity 12.5
senior 12.2
sports 12
field 11.7
course 11.7
park 11.5
men 11.2
playing 10.9
game 10.7
clothing 10.3
play 10.3
outside 10.3
street 10.1
protection 10
polo mallet 9.9
vacation 9.8
destruction 9.8
accident 9.8
military 9.7
track 9.6
wheelchair 9.4
day 9.4
city 9.1
fitness 9
golfer 8.8
nuclear 8.7
couple 8.7
uniform 8.7
standing 8.7
run 8.7
gas 8.7
retirement 8.6
running 8.6
hobby 8.5
child 8.4
portrait 8.4
player 8.3
action 8.3
speed 8.2
fun 8.2
danger 8.2
industrial 8.2
mallet 8
stalker 7.9
radioactive 7.9
radiation 7.8
boy 7.8
soldier 7.8
disaster 7.8
toxic 7.8
protective 7.8
chemical 7.7
mask 7.7
old 7.7
sky 7.7
health 7.6
drive 7.6
enjoy 7.5
equipment 7.5
smoke 7.4
ballplayer 7.2
dirty 7.2
spring 7.1
travel 7
cricket equipment 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.5
black and white 95.2
person 86.2
clothing 86
text 82.6
footwear 82.1
man 74.9
monochrome 73
posing 41

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Dog 88.3%
Shoe 78.6%

Captions

Microsoft

a group of people posing for a photo 80.1%
a group of people posing for a picture 80%
a group of people posing for the camera 79.9%

Text analysis

Amazon

DAEK

Google

YT3R2 XAG
YT3R2
XAG