Human Generated Data

Title

Untitled (four men moving boxes onto fork lift while man in suit watches)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9430

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 98.6
Person 98.5
Person 96.4
Person 94.6
Furniture 73.4
Text 63.5
Box 56.2
Machine 55.1

Imagga
created on 2022-01-23

box 100
carton 94.5
container 66.3
warehouse 48.7
boxes 36.1
cardboard 30.7
package 26.8
moving 22.9
3d 21.7
delivery 21.4
stack 21.3
crate 21.1
home 20.7
packaging 20.4
furniture 20.3
interior 19.5
business 19.4
house 19.2
storage 19
shipping 18.5
file 17
empty 15.5
new 15.4
open 14.4
office furniture 13.9
industry 13.7
cargo 13.6
transportation 13.5
modern 13.3
object 13.2
distribution 12.8
paper 12.5
brown 12.5
apartment 12.5
room 12.4
floor 12.1
transport 11.9
man 11.4
estate 11.4
board 11.3
render 11.3
equipment 11.2
work 11
building 10.8
packing 10.8
pack 10.7
pile 10.3
architecture 10.2
design 10.1
present 10
industrial 10
merchandise 9.9
structure 9.8
send 9.8
fragile 9.7
indoors 9.7
flat 9.6
recycle 9.6
gift 9.5
office 9.5
construction 9.4
card 9.4
closed 8.7
male 8.5
stock 8.4
city 8.3
group 8.1
shipment 7.9
freight 7.8
mail 7.7
block 7.6
post 7.6
heap 7.5
technology 7.4
adult 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

building 86.1
toy 84.9
skyscraper 68.8
house 68.3
black and white 57

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99.7%
Sad 95.7%
Confused 1.6%
Calm 1.4%
Angry 0.6%
Disgusted 0.4%
Surprised 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 24-34
Gender Male, 89.5%
Disgusted 69.6%
Sad 12.8%
Angry 7.9%
Calm 3.9%
Confused 2%
Surprised 1.8%
Fear 1.2%
Happy 1%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 97%
Surprised 1.2%
Happy 0.6%
Sad 0.5%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 28-38
Gender Male, 86.2%
Calm 76%
Sad 17.6%
Confused 2.8%
Angry 1.5%
Surprised 0.6%
Happy 0.5%
Fear 0.5%
Disgusted 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people in front of a building 52.2%
a group of people standing in front of a building 45.7%
a group of people sitting in front of a building 34.3%

Text analysis

Amazon

3
8 3
8
L
the L
the
SE
AUTO

Google

SOSS ON
SOSS
ON